For the most part, the only jobs that hire for English and other liberal-arts degrees are universities and schools. Even so, we're graduating people at rates comparable with STEM fields. With so much supply, the price of the work is driven far down.
This is such nonsense. Most jobs don't care what degree you have as long as you have a degree, and with an English degree you can apply for 90% of jobs. You could graduate and go on to be anything from an advertising executive to a soldier. Most jobs don't care - it's technology that's unusual.
However, saying you can get hired as an English major is different from saying these companies hire for English majors.
Engineering and CS are majors that companies specifically target in large numbers when hiring. "Related" majors like Math, Physics, and other sciences may also get recruited in the same batch.
Although English is sometimes listed as a targeted major for a job, this is considerably less common.
In your industry.
Other jobs are pickier.
My company offered an internship to someone who was 17. At Digital Ocean one of the interns was 16.
Tech moves quickly, and the fewer outside responsibilities you have, the more time you can commit to building and learning.
Turns out when companies are making money, they want to hire engineers to build new stuff. When they are losing money, they want to hire engineers to automate.
There were certainly some lucky breaks involved. I like to think of it this way: I'm lucky the doors opened, but it wasn't luck that had me searching for open doors and walking through them when I found them.
As far as I know, people could get by with a high school degree in the 1970s: there was no significant wage difference between a high school graduate and a college graduate, and both faced similar rates of unemployment. Nowadays employers take for granted that candidates must have some form of qualification, even for positions that don't seem to require it. Has the quality of education fallen, and grades been inflated to the point where degrees are no longer reliable signals of productivity?
And why should companies have their HR departments thoroughly vet candidates when they can rely on the college admissions process to do it for them?
Free vetting AND loyal slave... uh... I mean... employees! Who could say no to that!?
I would much rather hire someone who didn't finish college but has had five years of progressive responsibility in the right area, than someone who had just finished a four or five year degree course but had never worked.
I think you can tease out three possible factors influencing the situation here:
- more people graduating, more supply, means that even entry-level jobs are flooded with college graduates
- entry-level jobs in decline, either due to automation or companies just not interested in providing apprenticeships
- student loans shifting the power dynamics between employer/employee; you're more pressed to get a job, any job, so you're more likely to take up positions where you're overqualified and underpaid
These scenarios all form positive feedback loops with each other. With more high-skilled workers in the market, companies can grow their non-entry-level jobs while automating the low-level ones. This reduction in entry jobs means that desperate college graduates compete more fiercely with high school students, raising the bar for entry-level jobs. The raised bar, in turn, forces more high school students to go to college rather than start work, which saddles them with student debt, feeding the cycle.
I've noticed that many social situations tend to end up in such vicious, entangled circles. My dad used to call these the "downward spiral of failure" and the "upward spiral of success", and I really don't know how you transition from one to the other without a monumental effort or some kind of miracle breakthrough.
Also, your crosswise comparison between an experienced candidate without a degree and an inexperienced candidate with a degree only tells me that you put more weight on experience than on a degree. It really doesn't tell me whether a degree is a good signal, only that it's a worse one than experience. Now, if you had told me that you don't consider whether a candidate has a degree at all, then I'd be able to infer that degrees are terrible signals and carry no information at all.
You've got a space of time in a person's life between, presumably, graduating high school and sending you a resume. What did they do with that time?
Merely having a degree doesn't say much.
Merely not having a degree doesn't say much.
Having an honors BS in CS from CMU, MIT, Stanford, or a bunch of other schools implies academic competence and exposure to a certain range of ideas. But I don't know that they can be productive outside of that environment.
Having an ordinary BS in a STEM subject from a random college that I've never heard of means even less to me. But is it zero? No.
Holding down a job for those four years is a signal, too, and it needs evaluation. What kind of job? While living at home? Did anything progress during that time? Is it relevant to what we're trying to hire for?
A proxy for an IQ test?
I live in LA and, surprise surprise, there are a lot of writers, artists, and other folks who want to work in Liberal Arts oriented industries. Most of my Liberal Arts degree equipped friends are in sales while dabbling in their art on the side. A few of them sell a book or script every 3-5 years. The rest of them went back to school for a Law Degree or an unrelated Masters degree.
Which isn't to say that the issue is unsolvable. But it still needs solving. I don't think anyone has paid much attention to it.
If you are selling a product to a government or a big enterprise, it helps to have a lot of people with impressive credentials in your staff.
As a hypothetical, let's say that most jobs require a degree, and that you use 50% of the stuff in your English major throughout most of your career. You've eaten a 50% inefficiency on your college investment (essentially doubling the "price/skills" ratio of your degree), and you've lost time that you could have invested in obtaining those skills directly. Further, you are pressured into demanding a higher salary because of your college investment, which isn't matched by the skills you've received; so either your employer carries the burden of your inefficient education, or you do.
This is usually decided by the power dynamic between the two, and if you're a broke college student desperate for a job, you don't have much bargaining power, so you're shouldering all the burden. The worst part is, there's no way for you to fix this - there's no 2-year program that is 100% aimed at your career goal, and there's no entry-level job for you to start at 0% (you're probably already starting at the entry level even with your 50% degree). So you're stuck making this investment whether you want to or not.
One way to look at university is as insurance - you're learning all these extra things as insurance that you'll have a baseline of skills if you change your career. The issue is, not everyone can afford such expensive insurance, but most people are forced to take it under our hypothetical.
Bryan Caplan has a book coming out soon on the signaling model of education; there are also notes from a course he teaches on it.
Then why do they require a degree at all? What specific skills will a degreed applicant have that a high school grad won't?
For one thing, the "skill" of (probably) not being from that stratum of society that doesn't get their kids through university. Asking for degrees is a subtle way to perpetuate socio-economic discrimination. (In some cases, it has to be a degree from the right set of schools, not just anywhere.) It's so easy! You don't have to look at race, or what neighborhood someone grew up in. Just this: do they have a degree or not. Saves time and protects from litigation: it's brilliant!
Getting a degree demonstrates some measure of discipline and work, and the delaying of gratification for the sake of a longer term goal that is several years away, while completing various tasks, jumping through hoops and so on. It's like a job. You have to attend to certain things on time, like showing up for exams, and meet deadlines (term papers, etc).
If you have a degree, you probably pulled an "all nighter" or two to submit something on time or prepare for an exam, and that's just the sort of dedication that employers crave.
I.e. one enrolls in a privately-owned, for-profit university if they have the money but don't want to work their arse off to get a degree, or don't have good enough grades from secondary education to qualify for public unis.
Just as requiring a high-school diploma is a filter for the general education requirements that apply to a diploma, requiring a bachelor's degree in any field is a filter for the level of general education that comes with that, not the domain-specific skills of a degree.
And it's an imperfect filter, but its job isn't to be perfect; it's to reduce the absolute quantity of bad applications that need to be reviewed.
Skills which, as an aside, don't always have as high a priority with the "more practical" degrees whose use for a certain job is more obvious.
TBH I am far more likely to hire an English major with demonstrated technical skills than a technical person with a purely technical education.
Learning the technical skills to live up to the expectations of an entry level software or coding job is just not that hard for a decently smart person. I want to see evidence that you can think more broadly too rather than just checking the boxes.
Additionally, hearsay is one of the worst kinds of evidence, scientifically speaking.
I wish there were more development of standardized testing... People pretend profs are altruistic and stop acting like people when they become profs.
My best classes were the ones where the professors and I came from wildly different places. There are bad profs, but in my experience it is pretty uncommon for professors to actually want to see you regurgitating their ideas. Moreover, most of my grading was done by TAs anyway, so it's not even like the prof's take was relevant, since the TAs have their own agendas.
More often, the fact is that undergraduates (and many graduate-level students) suck at properly arguing their point. It's not their fault, really: they're less experienced at building a case, operating under a time and space constraint, and less knowledgeable about the topic than the person grading them.
And when you’re being contrarian (going against your professor’s thinking) you’re likely to suck even more than usual because your professor hasn’t spoon fed you an acceptable conclusion, a cogent argument leading up to it, and first principles that they agree with to build an argument from.
>People pretend profs are altruistic and stop acting like people when they become profs.
Who exactly do you think would be designing (and thereby imputing their biases into) these standardized tests?
I only do this because I like teaching, so this is not my main source of income. I am not on the verge of homelessness, because I do programming for a living. But the problem is still the same: universities are charging astronomical tuition and then paying a laughable part of it to teachers.
It does not make any sense, and IMO that explains in part why University education is losing its value: you get overworked and unmotivated teachers as a student, so you may better learn your stuff online.
IMO, this is what adjuncting excels at: supplementing your full time instructors with professionals spending most of their time in the workforce. This more or less happens in CS, but service departments like Math or English end up replacing the majority of their instructors with adjuncts working at 2-3 colleges.
Sadly, this trend does make sense and is easy to diagnose. Instructor salaries have stagnated despite increased student tuition because states have, for the past few decades, consistently cut higher education funding. I was laid off from a self-funded university unit amid a very real concern that the state's failure to raise taxes to pay for pensions would ultimately impact the university's funding, and while this was not the immediate cause of the unit's layoffs, it was absolutely a factor in whether to cover the unit's budget shortfall.
I think you entirely missed what the comment said. The statement wasn't about there being anything "wrong" with getting an English degree. It was that there aren't enough jobs for English PhDs and comparably too many students getting them, hence driving the salary down:
>> The fact that the article features an English professor is unsurprising. This is a supply-side problem--liberal arts degrees are in relatively low demand, but institutions continue to graduate students at unsustainable levels
In the last decade, universities have shifted teaching responsibilities from full-time, tenure track positions to part-time adjunct positions. Today, half of all teaching positions are part-time and include no benefits. Adjuncts are paid an average of $2,987 per semester-long course. That's less than $1000 a month. If you've ever taught a college class before and realize how much preparation it actually takes, an ostensibly "part-time" position easily demands 40-60 hours of work a week.
In some fields, over-supply is helping drive this trend. We produce more History PhDs than there are jobs to fill. This weakens the bargaining power that academics have on the market. But this does not absolve universities from their role in the immiseration of academics – after all, they continue to accept more PhD candidates with full knowledge that most of them will never find gainful employment in their field. This is not surprising given the fact that departments now rely on _graduate students_ to teach one-fifth of their course loads.
The "casualization" of post-secondary teaching is also a disservice to students, many of whom don't realize that the bulk of their instruction is now done by overworked, underpaid adjuncts who don't have the time or incentive to do their best work.
I think that some fault lies with unrealistic expectations. Every liberal arts graduate student knows that there are only enough positions for 1 in 10 of their cohort, but each firmly believes that they will be in that 10%. Worse, many who are talented enough to actually be in that 10% self-sabotage by being unwilling to move to locations they perceive as less than desirable.
I once had a professor try to convince me to go for a bio PhD. I objected that there's only one TT position for every 20 graduates. His response was that it's just about hard work - look at him, for example!
He didn't get my point; I didn't get the PhD.
Same for other "liberal-arts" degrees.
Because that would be like saying "hey, there is a big white elephant in the room!" to the room's occupants, when there is indeed precisely such a beast in that room.
Such a proclamation is rather more befitting of, say, the Economics faculty.
Humanities students do indeed go on to earn less, but the differences are much slighter than stereotypes would have you think. They also narrow as people progress in their careers.
There are indeed too many PhDs minted in the humanities, but I'd say this also applies to pretty much every field other than, perhaps, CS. Times are pretty tight for bench and social sciences as well.
It's a demand-side problem. Most higher education is funded by state and federal government, and they have been cutting funding dramatically, reducing demand for skilled academics.
It doesn't have to be that way. The U.S. could fund higher education at the levels it used to, and fund reasonable incomes for the higher ed workforce. To create a less educated population than the prior generation is to go backward.
This isn't a new phenomenon, but it's a phenomenon that's becoming more acute. Even if we ignore the adjuncts, the brilliant folks who taught us all what we know subsist on less than half of the median wage for our professions, with increasingly mediocre benefits.
These are the same folks whose amazing research work is powering our industry, often with a 10-year lead time.
Lately, I've been finding myself frequently thinking: "That's Texas."
Let me get this straight. Take a course load of four double-semester courses per year, with 200 hours of class time total in the given year. Let's double that to 400 hours, to be safe, to include exam marking and office hours. A single student paying $30k per year is paying $75/hour/student, or an annualized salary of $150k/year.
True, you can't work all year round, but the average adjunct earns less than a single average student's tuition?! Where does all the money go? How is the market so broken?
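The back-of-the-envelope math above can be sketched in a few lines (the figures are the comment's own assumptions, not real data):

```python
# Rough per-student economics of one instructor's course load,
# using the assumed figures from the comment above.

class_hours = 200              # in-class hours per year (4 double-semester courses)
total_hours = class_hours * 2  # doubled to cover marking, prep, office hours

tuition_per_student = 30_000   # one student's assumed annual tuition

rate_per_student = tuition_per_student / total_hours
print(rate_per_student)        # 75.0 dollars/hour from a single student

# Annualized at a standard 2000-hour work year:
print(rate_per_student * 2000) # 150000.0
```

The striking part of the comparison is that this $150k figure comes from one student's tuition, while the average adjunct reportedly earns less than that single student pays in a year.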
At the end of the day, I make less than minimum wage teaching these students. I'm lucky that I have another job that supports me.
Colleges and Universities are not lean organizations. My department has significant amounts of overhead - people that in some cases work full time doing administrivia or extra paperwork and they require a salary. The buildings and labs also require upkeep and technology has a fairly high turnover cost. Add in the racket that is software + books and things add up quickly.
I'd estimate that my department has almost a 1 to 1 back office staff to professor ratio - simply due to the lack of automation (and no desire to automate) many portions of the work that is done at the school. Many of the back office staff are full time and they receive a regular salary rather than a flat fee per student.
EDIT: Another unrelated anecdote: I was recently talking to a PI in Rice's Engineering program who had just finished up his spring class. He got rave reviews for the class; all the undergrads really liked it. About a week after the class reports came out, his director came to him and told him that he was teaching too well and that he should be spending more time writing grants. If his classes liked him that much again, he would be disciplined (I've no idea how, but he was nontenured, so it scared him enough). He then taught a 'worse' class in the fall. Whatever those students are paying for, it sure is not the education.
Don't students have to pay for software and books? Where I went to school the library didn't stock any textbooks, and there was an extra per-credit-hour "technology fee" slapped onto the regular tuition. You'd think the "technology fee" would cover all required software, but it only really paid for things like Microsoft Office, Peoplesoft, etc. Specialized software had to be bought separately (glares at Adobe) at the student's expense.
Don't students pay for those separately? How does that add up for costs on the school's end?
The real reason the adjunct-professor position pays so low is that that's the value applicants are willing to accept / the value society assigns it.
If that class of employee demanded more, or held more prestige, we would see efficiency gains elsewhere in the business.
Look at the housing market: in the 70s you could buy a house for much less, because people didn't have extra money to overpay with. With a thirty-year explosion of the mortgage industry and the idea that anyone should qualify to own a home, suddenly consumers had extra money to outbid each other on houses. This broadly increased property prices with no signs of slowing (unless the money runs out, as in '08). All of that debt increases people's ability to buy (raising demand, increasing prices) but not their ability to pay (suggesting some kind of time limit).
Now look at the education market: in the 70s you could buy an education for much less, because people didn't have extra money to overpay with. The cost of education was in line with normal market mechanisms. But with a thirty-year explosion of student debt and the idea that everyone should go to college and qualify for loans if they can't pay, prices broadly increased.
The problem is, for many, there is no alternative. "I'm the first one in my family to be able to go to college." "Not going to college makes it impossible to find a job." "I have to get an education." When "No thanks, that's crazy, I'll do something else" is not an option, prices are going to do crazy things. Perfect example being healthcare costs.
I think "Don't go to college" is the best advice today. Learn to program early, and start working as a freelancer when you're 18 and graduate from high school. By the time your friends graduate from college with a bunch of student debt and can't find a job - because they don't have job experience and employers want to hire people that know what they're doing - you'll be making and saving a lot more money freelancing, with 4+ years of experience under your belt already. Experience is waaaaaay more valuable than degrees. Might as well get it early.
* 80% fewer lawyer jobs because algorithmic contracts will kill the rest of the lawyer jobs that e-doc review didn't kill already. The 20% that remain will be lawyers/programmers.
* 80% fewer medical doctor jobs because all surgery will be done by robots because it's safer, radiology will be automated because it's more accurate, and most diagnoses will be automated through DNA analysis and readings from wearables. Doctors will need training in presenting (quite possibly bad news) results to patients and training to run the machines. The 20% that remain will be doctors/programmers.
* 80% fewer jobs in "driving vehicles" or "packing things" (truckers, factory workers, Uber drivers) because self driving cars will be mature 20 years from now.
* Restaurant jobs will still be around. But the cooking will be automated for 80% of the food sold in America.
* Business process automation paired with private equity firms and competition from startups will automate away a surprising amount of managerial roles.
* 80% fewer retail jobs because Amazon has helped make it so you can just walk in and pick something up and walk out and be charged for it, without any human involvement.
20 years is the difference between 1994 and 2014. I might be wrong about some of it, but I think people underestimate how much the job landscape is going to change and how important of a skill "programming" (which means a lot of different things) will be. Everyone should learn to program. And basic income, but that's a whole different can of worms.
Interesting; this contrasts with the claim of liberal arts graduates who maintain that simply being competent critical thinkers is all that's required for almost any job.
Programming requires “training and education”, too.
But for many jobs, especially if you have connections that let you bypass the filters designed to winnow the applicant field down to a manageable number (filters that throw out lots of otherwise-qualified candidates), you can demonstrate that by something other than the degree that is "expected" for the field.
Computing is by no means unique in this respect (and the big players in computing are as much known for near-total exclusive preference for top schools as is the case in the worst of other fields.)
Are you implying that you don't need skill and knowledge to program?
Or that it's impossible to get skill and knowledge needed in "other reasonable paying jobs" mostly outside of formal education industry?
That only works in industries like software etc. If one wants to be a lawyer, doctor etc they have no choice but to go to college, right? Some American students are going abroad (Germany etc), that could be a short term option.
For lawyers in some US jurisdictions, including California, no; California, for instance, requires general education of two years of college or “demonstrated equivalent intellectual achievement” prior to study of law, and several options for the study of law, including what amounts to apprenticeship in law office or judge's chambers. So traditional college or law school is not strictly required.
The top law firms employ a small percentage of employed lawyers. And not even a huge percentage of graduates of top law schools. And it isn't even the kind of work some people going into law want to do.
Confusing “what you need to get a job at a top law firm” with “what you need to be a lawyer” is like confusing “what you need to get a job at Google” with “what you need to be a programmer”.
> And the legal field is generally oversupplied with lawyers so if you don't work at a top law firm you won't be making a great living for a long time.
If you avoid undergraduate and law school debt (and especially if you are a paid employee instead of a paying student during your legal education), you can make a great living at a substantially lower salary than if you don't; and if you do a legal apprenticeship, you come out with real-world experience most law students won't have (and quite often a job in the office you apprenticed in, if it was a law office and not a judge's chambers).
If you don't get an apprenticeship, then your options after passing the bar with self-study are essentially (it seems to me):
1. low-paid government work (like public defenders)
2. setting up your own practice (hard to build up a client base initially)
3. working on contract/temp basis for another law firm with variable prospects
Law school isn't just about credentialling and signaling; it also sets you up with a valuable alumni network that helps you get work. Anecdotally, it's easier for an aspiring programmer without a network to get a referral into a tech company: go to a few meetups, hackathons or conferences, build a portfolio/GitHub, meet a few people and ask them if they'll refer you (I've both gotten a job and helped someone get a job this way). And very few people ask or care about where you went to school when they can see your work instead. Again, I may just be biased because I know the tech field better, but it seems to me that it's harder to demonstrate your legal skills, since you can't exactly post a sample contract or legal opinion you wrote on GitHub (or maybe there's a portfolio site for lawyers too, who knows).
> Confusing “what you need to get a job at a top law firm” with “what you need to be a lawyer” is like confusing “what you need to get a job at Google” with “what you need to be a programmer”.
Again, not knowing the legal field well it seems to me that even mediocre programmers do better than mediocre lawyers because of the supply-demand differences between the two fields.
There are very small numbers of people who choose that route (and very low awareness that it even exists); as I understand, there are mere dozens actually doing it any time in California recently.
One of the more common routes, reportedly, is people who are already employed (e.g., as paralegals) in the office they become apprentices in.
And getting into medical school is basically luck of the draw; there are far more overqualified applicants than necessary. Thinning that herd a bit isn't going to hurt.
And his lectures are available for free.
If you get a degree, lots of what you learn will be outdated fast. The most important things you learn will be on the job actually making things.
> Unless maybe you are going to a top 1% CS program, there is little need for a degree in software engineering.
I think you missed what I just said. You say there "is" little need. But I'm not looking at the present. We still don't know whether that holds for the long term. (Well, lots of people think they know, but they're only guessing, not going on evidence, obviously.)
Note that it's simply not true that people would learn the same material independently that they would learn in a classroom. Yes, it is theoretically possible, and some people can pull it off, but it's not at all universal or even common. A lot of people need the environment to learn better. Online lectures just don't cut it for everyone. So yes, "you need to go to college" can be true regardless of how good online lectures are, and we still don't know if what people get out of a CS degree will on average give them an advantage compared to those who don't get one in the longer term (though I don't really have statistics for the shorter term either).
That's not always true. Most people seem to think they "need" a car, yet the price of cars isn't astronomical.
But you're right that even though the cause is noble (education available for everyone) our current mechanism of funding it results in runaway prices and for-profit scams.
I don't think the answer is for everyone to skip college and learn to code, though. I think in the near future coding will be a basic piece of literacy, but only one piece. I think it's reasonable to imagine a future where, just as people study English and learn to do some writing in English 101, they will also learn the skills to munge a CSV file or write an SQL statement in Programming 101, and both will be part of a well-rounded education.
The last thing this industry needs is a flood of high school graduates creating shitty insecure websites. I say this as someone without a degree, who knows first hand that although it's possible to make it w/o a degree, it shouldn't be anyone's first choice.
Cars are not subsidized in the same way that healthcare or education are subsidized, though. There are fancy, expensive cars for people who want/can afford them, and there are more modest cars for the rest of us. There are lots of manufacturers of cars from around the world (and those cars from foreign manufacturers are available locally).
Of course, there are fancy expensive schools, and cheaper schools, but the subsidies (or maybe some other force I'm unaware of) seem to make the 'cheaper' schools still get more expensive every year.
As to how to fix it, it's complicated. If the fed wants to continue to pump money into universities, they could federalize them, but it would be a huge transfer of power to transfer control of universities from the state to the fed. There's accreditation, the board of regents, and a variety of other organizations that would then be federally controlled. On top of this, since a university is such a large piece of a local community and requires infrastructure to support it, it makes sense to manage it locally. Further, the state would probably just play the same game if a university was federally controlled and still milk money. Really, the states need to realign their priorities and increase funding for the university system if they want stuff like this to stop. Removing federal student loans would help alleviate some of this pressure, but there's still pressure on universities to move to adjunct faculty to reduce costs as well, which is what the article was about.
More money entering an economy will certainly allow buyers to bid up prices (as with houses and college tuition). The question is which other prices are being bid up even more rapidly and which are remaining flat or declining on a relative basis.
These sorts of problems are why both illicit and legalized counterfeiting are so pernicious.
According to this article in the Washington Post, the first occurrence of Texas as "crazy" in Norwegian dates to 1957. Makes me wonder if it's used this way in other European countries too, and whether or not the etymology is independent.
It's worth noting that unlike the term "Amerikanske tilstander" ("American conditions") in politics (as shorthand for implying your opponent is pushing for whichever stereotypically bad thing about American politics fits best in the specific policy area), the use of "Texas" refers to wild west movies, and so is about 19th century lawlessness, not modern Texas (though there certainly is still a stereotype of Texas as being at least, well, somewhat Texas)
Exact use of the term could have certainly evolved over time, of course.
Maybe you should switch to saying "That's Illinois" or "That's Chicago", they're the dysfunctional state now.
So no. The fact that Texas can fund state schools of reasonable quality with the spoils of these actions does not dismiss the fact that it's one of the most frustrating states in the union from the perspective of modern education and industry.
Why is the US science education so bad? Texas. Almost exclusively. Its primary opponent: California. Texas causes similar problems in many other fields, including patent law and taxes.
If I had the option to go to the school without paying for the team, I would certainly have done so. The school was still 'worth it', but dang it, I could have had more consumer surplus.
Another issue (probably not the first piece of fat I'd cut) is Title IX sports: schools have to make sports programs accessible to women and men, and they most commonly do this by creating women's and men's teams. I don't have a good understanding of the law, but there might be a better allocation of resources if they simply opened teams to both sexes rather than creating sexually differentiated teams.
Sure. I wasn't endorsing the system, just explaining it as I see it.
> schools have to make sports programs accessible to women and men ... might be a better allocation of resources if they simply opened teams to both sexes rather than creating sexually differentiated teams.
I think that only works in a world where women and men are both physically capable equally in all aspects. That's not reality though, and at the high performance levels these teams play at, that difference is likely exacerbated to the degree that for many sports mixed teams would really just mean a starting lineup of one sex, which is not exactly succeeding at making it accessible if that's the goal.
Interestingly, if there were enough variation in peak mental abilities, we might see a similar segregation at peak performance levels. I suspect that while there is evidence that men and women often have different mental aptitudes, that's on average, and at the peak there is little difference in potential (even though there may be a difference in occurrence). E.g., there's plenty of evidence and reason to believe the strongest person that ever lived is male, but there's not a lot of evidence to believe the smartest person that ever lived, or even in any one field, is male (even if it may be likely to currently be one sex or the other based on ratios of interest).
I get that women and men have different physical capabilities but why split it on gender? Why not split it on race? There isn't much racial equity in some sports, apparently some races can't compete as well as others and the demographics speak for themselves. Better yet, why not split sports by work ethic? I'd be more inclined to do sports if I could play with people who are similarly disinterested and the status quo isn't very accessible to me: a person who over eats, sleeps in, and forgets rules. I'm sure we can map that to a disparate impact somewhere.
Or ignore the discussion entirely and say "we'll let colleges decide how to handle their sports programs".
I'm confused, I wasn't making any case for that. I'm not sure where we got to Federal funding tied to specific program attributes.
> I get that women and men have different physical capabilities but why split it on gender? Why not split it on race? There isn't much racial equity in some sports, apparently some races can't compete as well as others and the demographics speak for themselves.
Do they, or is that primarily a matter of socioeconomics and class providing less traditional opportunities for certain ethnic groups, leading them to funnel effort in alternate areas, such as sports? It could also be a cyclical system, where many role models for a race are currently in specific areas (such as entertainment and sports) influencing newer generations. I wouldn't be comfortable attributing it to genetic variance of physical capability without a bit more info.
Federal funding coerces schools into offering women's sports programs.
> I wouldn't be comfortable attributing it to genetic variance of physical capability without a bit more info.
Why do we have to attribute the difference to genetics or physical capability to make sports programs accessible? What about mental capability? Mentally impaired, physically okay individuals compete in the special Olympics. Why not create a class for people who are using performance enhancement drugs?
No. Leaving aside whether funding they aren't forced to take can coerce anything, it mandates equal sports programs: they don't have to offer women's sports programs if they don't offer men's sports programs.
It used to be that masses of factory workers were required, so those people were carefully set up for wealth extraction until they unionized and staged an organized rebellion.
America's knowledge workers are where America's value is, so economic yokes have been installed at the most profitable point: where people receive higher education.
Honestly I'm not sure the EU is THAT different, I think you just do it at a different time.
I've got friends who are working on their PhDs while teaching 4-5 classes (at other locations). I considered being a TA a few years ago, but you couldn't have a second job. The stipend as a TA was $1300 a month + tuition for 40 hour weeks (may have actually been "20" hour weeks, not certain). Yeah, right.
It's pretty screwed up. It's a privilege to be in higher education but it is also pretty crushing and you've got to be a soldier to deal with it. The worst part is I've almost never met anything but great people who willingly put themselves through it.
It's why as an overeducated graduate student, I'm against expanding education more and more (like free community college for all). The incentives are way off currently, not sure what the solution is, but it has to change.
Sure, I wouldn't start a family on that, but it far surpasses what I lived off of right after college.
EDIT: It looks like the article was about people late in their careers. I was assuming that adjunct faculty positions were only entry level or for grad students.
They do currently include decent health insurance, at least. No dental/vision, obviously.
Nope, still gotta pay rent. Attended Columbia for grad school, had to compete in the Manhattan real estate market on < 30k a year for an apartment.
> They do currently include decent health insurance, at least.
Ehhhh, it was of the 'if I get hit by a car and I am in the hospital for two months, I won't go bankrupt' variety. Better than nothing, at least.
Probably school-dependent -- I'm aware of one that does have dental at least (I forget about vision but I think it has that too).
> It's pretty screwed up. It's a privilege to be in higher education but it is also pretty crushing and you've got to be a soldier to deal with it. The worst part is I've almost never met anything but great people who willingly put themselves through it.
Isn't that selecting for exactly the kind of people they want in academia, though? Great people who would willingly put themselves through pain just for the sake of advancing knowledge? I'm not actually convinced that if they paid a better salary it would necessarily end up selecting for people better suited for academia. Not saying this because of lack of skills, but because if you're worried about making money the moment you start, it seems pretty plausible that you (or far more people in your shoes than would feel so otherwise) would be distracted by it the whole time and not working purely for the sake of advancing your science.
Surely there's some middle ground between incentives that attract too many people and underpaying dedicated talent? Even if they were paid more, the work is so intense and extended (grinding for years) that those attracted mainly by money would still have more attractive alternatives.
That being said, with the nature of being a graduate student and being on campus always, if it was "20 hours" a week it was really 30+. I do have a separate university related job now and the hours are anything but consistent. But you may be correct.
The truth is, the brilliant professors are surrounded by mediocre professors, who in turn are dwarfed by the numbers of administrators and staff. At the vast majority of universities you are unlikely to get taught by anyone especially brilliant.
A university education was already a bit of an expensive luxury in the past; now, with the internet, that's even more true.
Adjuncts are a different story.
I taught as an adjunct part-time at a state school where professors made $150k. I made $5,000 per course. I did it more or less for fun, because I like teaching. That's not bad for extra pay on the side, but terrible if it's your only pay.
Edit: as a side note, the department head told me she fought to have adjunct Comp Sci professors make more because the field is in such high demand. I'm sure English adjuncts at the same school make far less.
That's for adjuncts though. I think for full-time professors they have PhD requirements per the accreditation boards that they need to maintain. But once the quota is met they can hire anyone.
The article gives a median income of under $50K for non-adjuncts. Of course, that includes assistant/associate professors.
I doubt it is because of associate professors. More likely it is because of the size of the school, location, and type. I bet there are a lot of community college instructors in there.
A quick look in my area (New England) shows associates starting at about $60k and going up from there. But New England has a high cost of living so it is all relative.
Tenured professor at some small liberal arts college usually isn't doing as well.
No they're not. STEM professors get paid quite well.
For comparison's sake, here are what physical education teachers in Palo Alto Unified make:
Here's San Jose Unified teachers (they don't seem to have P.E. in the job titles):
Not singling out P.E. teachers or K-12 in general -- look up any Palo Alto area job type (police, technicians, etc), and they're all in the same range. And living in this area myself, I completely believe that scale of salary and benefits is needed if you want employees who can live within 50 miles of Palo Alto.
Obviously, people shouldn't need to sleep in cars. But the answer to that is a basic income. It's not treating a specific group of people specially just because they have college degrees and are more relatable to the elite than folks who work at Wal-Mart.
In a time of bullshit thought leaders and companies like Equifax who will tell you obvious lies, the academic discipline with its rigor, criticalness, and policing against self-promotion is a kind of last-bastion for intellect in the West. And even that is sort of cracking.
BTW, one country has already tried the “you can only get useful degrees that have immediately available needs” route. It was the Soviet Union, in which everyone was an engineer, to such absurd levels that Gorbachev got his degree in “Steel Metallurgy of Ball Bearings”.
So does water, but that doesn't make it valuable; relevant question is the value of one more teacher. Given the extreme oversupply, the answer is "not much". Markets are speaking; it's up to workers to listen.
To be sure, you can make the case that academia is underfunded for these roles; but I estimate that you'd have to spend like a fourth of GDP to get a reasonable salary for every qualified academic who wants to do that. See this previous HN discussion on the matter:
This is true, but it's not terribly efficient about doing so. Much of it provides little to no useful return, and some of it actively harms the world we live in. If we wish to talk about the value of academics, we must honestly look at the whole balance sheet. Value isn't just about revenue, but costs, as well. And with rising tuitions and falling class quality, the ROI of a college education isn't as high as it used to be. For students and for society in general.
> the academic discipline with its rigor, criticalness, and policing against self-promotion is a kind of last-bastion for intellect in the West.
I'm really struggling to word this in a polite way without discarding facts (sorry if I'm rude), but, what kind of fantasy led you to believe this is or has been the standard for academia?
- Academic discipline is more about knowing how to file papers and abuse TAs and post-grad students for free work, rather than about following scientific processes. There have been truckloads of articles lately about insufficient scientific rigor in published papers, especially in the social sciences. The glut of students, professors and information has led to more noise than signal. Self-referential theories get presented as fact, where papers reference other papers still in peer review. Student thesis subjects are encouraged to focus on the professors' work in order to increase citations and prestige. Controversial papers are encouraged instead of scientific papers, which has caused all sorts of problems in academic journals and among the parasitic journalists who write clickbait from them. And then there's the ever-present massaging of data and discarding of any contradictory samples.
- Self-promotion is absolutely huge in academia and has been that way for decades. That's how "publish or perish" came to be a staple of the industry. Not to mention the prestige factor in selecting mentors and advisors, politics in academia between people at the professor level can get extremely vicious. Elite oligarchies are as fixed in academia as anywhere else. And if they can't teach, they just move into administration and promote themselves there. Administration is probably the worst part of a modern educational institution - full of waste and corruption. Like how Katehi got rehired at her chancellor-level salary after a year's vacation after that pepper spray incident at UC Davis. Academia is literally infested with bad actors. Not to say there aren't good ones, but the bad ones are especially well connected and hard to evict. There's no policing against it except against the people who get caught before they're successful at entrenching themselves.
- Critical thinking isn't even remotely a curriculum requirement anymore. Many classes actively discourage critical thinking, and instead encourage rote memorization or directed analysis instead. In some classes, if you dare to challenge the narrative presented, you might receive Title IX sanctions for your oppressive actions from students and professor. After which you'll then be brought before a panel, denied representation and judged by a biased group more focused on maintaining image and federal funding than on the truth. And this might be from something as small as questioning the statistics or the sample set from a study. Students are encouraged with safe spaces and other policies that prevent them from processing or even seeing opposing viewpoints. That is not "criticalness". It's pandering.
- Deceiving students in academia is also a thing. Once again, there's the whole problem with scientifically bad papers being encouraged, published and referenced without peer review. These things make it into the curriculum and don't get pulled out after a retraction. Some classes will teach you that you shouldn't critique at all. Class books are often written by the professor to supplement their income, and contain their own pet theories. They put out a new edition every year with minor changes just so you can't buy a used book. And you've obviously never sat through a lecture rant on the professor's pet grievances. Lecture after lecture on the evils of western civilization from a person who literally couldn't survive outside of it can really tire you out. And honesty about career applicability for degrees is at an all-time low. Partly because some of the people teaching you have limited career prospects themselves and don't like to stare the facts in the face. Dishonesty about the value of the information presented and its critical reception is kind of the worst kind of dishonesty when you're charging someone a year's salary to listen to it.
There are good things about a university education. And not every situation is this bad. But it is not the ivory tower you're perceiving, and likely hasn't been that way for the majority of your lifetime, if not all of it. The honest, naked pursuit of science and truth has always been the ideal, but rarely the reality. We don't want to throw the baby out with the bath water. But we've got to be more honest and objective about problems in academia if we have any hope of fixing them.
Let me add one more point, and see if what I said merges with what you said. If not I'm completely wrong.
A lot of books, like The Ideas Industry by Daniel Drezner and Science-Mart by Philip Mirowski, argue that the cutting of university budgets since the late 60's has so undermined their stability that academics have had to engage in a stupid, pointless survival battle of numbers, conformity, and politics. I'm sure the politics was there beforehand, but they make a convincing case that critical thinking and independent research are much more possible when you're not constantly afraid of losing your funding, or scared in general about your job.
An additional point to this is that a lot of individuals see the cutting in college funding as the GOP's revenge for the 1960's, because a lot of the behavior of that era was seen as coming from college campuses. UCLA was actually free up until 1967 when Governor Ronald Reagan began the process of charging tuition.
So in conclusion, academia does have a lot of terrible things about it, but I think say 60% of it is fixable simply with a better relationship and funding style between universities and the government.
If academic discipline really taught rigor and critical thinking, do you think the subjects of this article would be putting themselves through such horrible experiences?
From what I've seen, critical thinking (which I do regard as a highly desirable capability, on a personal level) has to be acquired outside of any established institutional framework. An institution which actually teaches critical thinking is at a disadvantage in many circumstances, because it reduces compliance and hence organizational strategic focus. That disadvantage means that well-established institutions almost always have ways to shut critical thinking down. Those which don't tend to be out-competed by those which do.
- There are far more qualified people than could ever be eaten up by the demand for them.
- In practice, most workers of this type can't make it their only job.
- Though a small fraction at the top are extremely well compensated.
- Unionization can allow workers to capture more of the surplus (and has, for actors), but the oversupply is so extreme that it still wouldn't make the job a viable career path for the typical qualified applicant.
My previous HN exchange on this point: https://news.ycombinator.com/item?id=10964421
Here we go again. Please show evidence that it works. Oh right, there isn't any. Please point to a reasonably complete economic model that shows it will work.
It contains extensive citations, and costs about twelve to seventeen bucks on Amazon, depending on whether you buy it as an e-book, paperback, or hardback. If you don't wanna spend that much money, buy it and send a picture/screenshot of the receipt to email@example.com, and I'll cover it for you.
In NY, homeless people cost the city over $40,000 per person! And there isn't even much to show for it!
The Wars on Poverty and Drugs have spent billions, also without improving the living conditions of the most vulnerable economic group!
At what point can we say enough is enough?
And here is some proof for you: Experiments in 'just give people money or housing' have proven more successful than traditional social services(1,2)!
BI is fairer, more efficient, and has the added benefit of removing incentives for crime and other deviant behavior.
Besides, it is more morally just. We are all here in this world where most of the wealth was created before we were even born. Everyone is entitled to at least a small portion of it.
*Depending on which programs you include in this category, namely SS, Medicaid, and Medicare.
Giving it to everyone else besides those in need would just be like a progressive tax break.
If your concern is it will disincentivize work: it is true there will be some drop in the labor participation rate. However, that drop is unappreciably small in comparison to the amount of corruption and number of total jobs in the economy that do not add any value to society. If your concern is that "people won't produce things we need," you should be focused on bankers, healthcare lobbyists and insurance companies, a significant part of corporate America, academia, government, teacher's unions(see: 1), etc.
In fact, removing the above corruption in the labor market and replacing with BI may actually increase incentive to work by opening up opportunities for innovation, and creating good faith in governance within the labor market.
The internet will cut out all middlemen, and teachers are middlemen between you and knowledge, or gatekeepers to degrees. If you do not add value (and empty classrooms show many teachers do not add value; in my theoretical CS and math classes, professors only wrote proofs on the blackboard for 1.5 hours and then left), the internet eliminates you.
It is not enough to just have a BS/BA in a STEMy field, you need the MS as well. I think it's fair to conclude that the credentialing is then worth less at the undergrad level.
Online courses are great, but they don't replace everything. I don't think the future of universities is bleak. The explanations above considering that we have a surplus of Ph.D.'s seems to be a lot more likely of a cause to this. Simple supply and demand.
The MOOCs don't work well because they are 100% virtual. What they need is a bit of in-person, human touch. I'd like to see a system where online courses are being supervised by "educational coaches" who are not necessarily experts in the field, but experts in motivation and maximizing results.
Imagine a remote location, where there are no universities. A number of people could take completely different online courses there, monitored by the same coach. The role of the coach is mostly to witness the effort of the student (great for motivation) and counsel the student as to how to apply his effort most efficiently. The coach could help organize local study groups as well, if there are enough students. Technical questions could be handled by online forums and the MOOC staff. The training of the MOOC counselors could be a MOOC course itself, to spread the system organically.
And when that happens, it will open questions about the antiquated college admissions processes of said universities, which evolved from the physical limitations of campuses. Who is the more ideal job candidate between person A who was admitted to an Ivy League because he/she had the know-how and resources to play the college admissions game vs. person B who didn't, if person B is able to get higher grades on said Ivy League coursework? This will no longer be a hypothetical question when MOOCs have matured. This effect will put downward pressure on tuitions, which will be a good thing for students, while at the same time enriching the handful of colleges and universities with enough of a brand name to make it in the MOOC world.
That effect only seems to be growing. That said, certainly do agree with you in principle.
Penguin, PwC and EY have already dropped university degrees as a requirement.
"The move comes just months after accountancy firm Ernst & Young, one of Britain’s biggest graduate recruiters, made a similar announcement, saying in August that it would no longer consider degree or A-level results when assessing potential employees"
"It found no evidence to conclude that previous success in higher education correlated with future success in subsequent professional qualifications undertaken."
"Goldman last year also made other moves to help it identify strong candidates who may not attend Ivy League schools by scrapping first round interviews on college campuses in favor of a video platform."
When hiring, I looked for an Aaron Swartz, not degrees.
The whole prestige of academia isn't access to information: the internet has more academic content than any university (e.g. Sci-Hub, arXiv, book sites, online courses).
Universities own vast collections of lab equipment/tools/devices, which they can afford due to economies of scale (serving groups); this "lab-grade" stuff is out of budget for most people.
Of course there are areas where research can be done with older, cheaper and simpler equipment, but cutting edge science is confined to top-hardware owners:
An example is amateur astronomers, who have much less capable telescopes and recording equipment but can still advance science with affordable devices.
This is not the case at all for many fields. In my branch of linguistics, for example, the majority of important literature is not available online. A person cannot teach themselves this field; you need access to the print resources in a large library at a handful of universities. Now, since the dawn of ebook sharing sites, my colleagues and I have been gradually scanning resources and uploading them to Libgen or the like, but progress is slow and we have barely scratched the surface.
Weird, my professors answered questions that I and my coursemates had.
Even here in NL there are niche websites that pretend to be dating sites but that actually are sites where students are hooked up with 'sugar daddies', effectively prostituting themselves to be able to finish their academic education.
This is a hot topic in the news here right now.
I thought in The Netherlands that higher education was effectively free.
I hadn't heard about said hot topic, but one of the reasons might be that recently the government stopped remitting part of the student loan as a gift after graduation.
They're all over the news, probably the best unpaid marketing campaign in a long time.
But life is not.
This is the future for all of us, when programming becomes commoditized just as teaching has. When our salaries are pushed to the bottom. We are not owners, we are not capitalists, we are not bosses, we are people with a skill you can learn on the internet and a corresponding talent for it.
Either a) this will happen to us as programmers, b) we have credentialing and gatekeepers to keep supply low c) we have a union to collectively bargain, or d) radical changes in the government save us from this fate.
In the U.S. (d) seems impossible. Programmers as a group seem to be virulently opposed to (b). So choose, unions or barbarism.
Teaching salaries are low relative to education level because lots of people are socially pushed toward teaching (obvious example: many people's mentors are teachers). Programming doesn't have this dynamic.
I'm not opposed to unionization - I joined the new grad student union when I was getting my PhD. But I think you should focus on programmers' specific problems, like long hours and low vacation time, not on problems from other jobs which have low relevance.
Programming and teaching are similar; they are prestige positions (for now) that mark you as one of the professional class, people who do them are by and large hugely passionate and would be unhappy if forced to do something else (many programmers I know started before school because they loved it, do it in their spare time, etc). Passion in capitalism gets taken advantage of and exploited, since dispassionate economic assessment is how a 'rational' actor works in economic models; passion is a weakness from the perspective of wealth accumulation and economic success. The thing that links the fields in my view is passion.
If you need to qualify to join a union, or to do a specific job in a union, the qualifications will be owned by the companies which are politically powerful.
Imagine being unable to qualify as a C++ programmer unless and until you've qualified on Microsoft Windows and Visual C++.
Imagine being unable to qualify as a Perl programmer at all, because Perl isn't one of the technologies owned by a major corporation.
And, of course, doing freelance work, or contributing to Open Source, makes you a scab, stealing work from Poor, Honest Union Workers.
Unions can be for unskilled labor. They can exist without qualifications and gatekeeping. That's why I presented that as a distinct option, separate from them. All they need to do is bargain with employers to make working conditions better. That's it. Unpaid overtime "just this once because we're all part of the team, guys" every month? We strike and make a deal that says you have to pay us OT. 80 hour weeks on salary for 40? ops strikes and your system goes down and nobody fixes it until you make a deal and cut hours. Employer tries to put a clause in your contract that says they own your side projects? They have to go through the union first, and programmers vote hell no.
The rest of the stuff you've cooked up inside your head are sure things I can imagine but they have nothing to do with unions; they are imaginary FUD.
Otherwise, by using GNU Emacs, you're depriving a union worker of their wages, and, since GNU Emacs gets updated even when there's a General Strike on, contributors are scabs, and union supporters have, historically, killed scabs.
I have never heard of, read, talked to, or even imagined that any living person would consider contributing to open source to be "depriving a union worker of their wages" or "scabbing". Have you ever even heard of an actual existing union before? Have you ever known someone who has participated in (well, let's be generous: imagined) a strike? Has any union ever banned its members from donating their free time to charity? Are you listening to yourself?
The union propaganda campaign must be working.
This is true insofar as setting a floor on any of wages or working conditions is, strictly speaking, a supply restriction.
> which would destroy Open Source
No, it wouldn't.
> Otherwise, by using GNU Emacs, you're depriving a union worker of their wages
I'm a member of a union that represents programmers. None of our contracts restrict the use of open source (or even paid off-the-shelf) products by the employer.
Contracting out custom programming work isn't even prohibited, though it is restricted.
I think this reflects a larger trend.
Most institutions are taking on the rituals of the dominant organizing principle in society: the corporation.
I believe this is why church groups are having "annual general meetings" and that formerly not for profit institutions are mimicking the organizational structure of corporations: meetings, hierarchical teams, layers of administrators etc...
Walk into a meeting, look around. Add up everyone's per-hour salary. That is what having that meeting costs. Now consider what it would take for each person to communicate the relevant bits to everyone else, until a decision was made, without the meeting. That's potentially how much more not having the meeting costs.
You'll conclude two things. Meetings are crazy expensive for the organization. Meetings can be worth it when the alternative is even more crazy expensive. But the meeting should be as efficient as possible. And you want people to only be in them when there is positive value from doing so.
I just ran a standup. I do this every day. 8 people for 15 minutes. Let's suppose their average salary is $50/hour. That's a $100 meeting. I spend another 30 minutes per day summarizing it, making notes available, and elsewhere keeping a status page up to date so that other people don't have to ask about that meeting.
That extra work keeps another 5 people from showing up. That keeps me from having to have another set of meetings with those people. That keeps the meeting from ballooning to 30 or 45 minutes per day.
In a good organization you want everyone thinking about meetings like I just did. Yes, lots of meetings are needed. Yes, important people will spend most of their time in meetings. But make it efficient. And eliminate any meeting, and any person from any meeting, that doesn't pull its weight.
Organizations should be run with the necessary minimum of meetings, hierarchy and administration. Universities have entirely lost sight of that. And our response should not be a complacent excuse that this is how it should be for any organization of that size. Our response should be to call them out on being severely incompetently run, with no excuse other than the self-aggrandizement of useless bureaucrats.
Big corporations do require bureaucracy, but smaller entities use bureaucracy as a substitute for competency and autonomy.
You can find smaller scale versions that operate on similar principles. https://hbr.org/2013/11/hierarchy-is-overrated
As to universities: students already self-organize around majors, creating an internal market. Some universities try to limit the sizes of various majors, but that's far from necessary. You generally allow professors to seek outside funding for postdocs etc. without command-and-control oversight.
You could use internal markets to make various decisions, such as funding allocation, though self-organization is often just as efficient.
Net result: you can easily have a university without a president/CEO. There are even plenty of historic examples of this.
So, it looks like a duck and smells like a duck to me.
My experience comes from Presbyterian churches, which have some features highly analogous to corporations, deriving from theology and predating the modern corporation by thousands of years. Thus there is an annual "Congregation and Corporation meeting", which is really two meetings held at the same time for convenience but which are formally started and ended separately. Participation is almost identical, save that the "Congregation" can have communicant members who are minors and therefore not legally voting members of the earthly corporation.
At my alma mater, the state cut support for the state universities to practically nothing. Voters saw how much money the big universities were making on college football, and they consistently voted to cut money to the university system.
Only problem? Most state-supported universities did not have a football program. So the universities:
* turned to Chinese/Indian nationals (no financial aid/full price)
* raised everyone's tuition (went from low-cost school to higher-end)
* raised other fees
My university is in a part of the US that has low cost of living. So at the very least, the profs are unlikely to be homeless - yes even at the wages specified.
There's something of a joke/quote in university circles about how universities went from "state funded to state supported to state located." When you get <= 5% of your budget from the state, the state support means nothing and this is the situation for most big state unis.
Also, you forgot the Saudi Arabian nationals whose govt fully funds their US education (though this program is becoming more restrictive in recent years with the Saudi govt selecting specific eligible degrees). I'm not trying to start a flame war here, but my personal experience is that many of these Saudi students struggled deeply, this fact was known by the administration, but the giant full-price checks were essential to keeping the lights on.
Other Fees: Yes, the rise of fees that are starting to rival tuition. Fees are a convenient way to lower the sticker shock of tuition while keeping the total cost the same. My former uni made a big promise about "freezing" tuition for years at a time, but all they ended up doing was jacking up the fees to compensate.
I'm just curious, could you elaborate on this? Are you referring to them not being able to cope academically, or to cultural differences or racism?
I've known a couple of these students from my college days and they seemed to be doing alright.
In SC, funding went down while tuition went up and lottery scholarships went up. The university budget is around $800 million/year, and the athletic department across all sports is about $80 million/year. The athletic budget is entirely self-sustaining, but it's also almost entirely spent. Its performance has no impact on the rest of the university except for alumni engagement/excitement. Even if it were making a $20 million profit, that's still a drop in the bucket compared to the total university budget.
At the same time, applications are through the roof and there's not enough room to take everybody (but they are investing in new buildings to increase capacity). The ratio of in state to out of state has always remained close to 60/30 though.
Admin costs have increased, but the prestige factor has gone up significantly as well. Clemson went from being the #70-something public university to a top 20 in the span of two decades.
Very sad that a Dem aided and abetted this.
I learned that for the budget of most research projects, 50% was administration. That is, 50% wasn't for researchers, the lab, labor, or equipment. Part of that 50% administration budget was paying for the oversight of said budget.
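One source of confusion with figures like this: indirect-cost ("overhead") rates are usually quoted as a percentage of direct costs, not of the total award, so a 100% indirect rate means half the total goes to administration. A minimal sketch, assuming that standard rate-on-directs accounting (the function names and the example rate are illustrative, not from the comment above):

```python
def admin_share_of_total(indirect_rate):
    """Fraction of the total award consumed by overhead, when the
    indirect rate is quoted as a fraction of direct costs."""
    return indirect_rate / (1 + indirect_rate)

def direct_costs(total_award, indirect_rate):
    """Dollars that actually reach researchers, lab, labor, equipment."""
    return total_award / (1 + indirect_rate)

# A 100% indirect rate means half the budget is administration,
# matching the ~50% figure described above.
print(admin_share_of_total(1.0))     # 0.5
print(direct_costs(1_000_000, 1.0))  # 500000.0
```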
I am no longer amazed when government projects go severely over budget. It's designed to do that, and not just by the contractor but by the government itself. Budget a project, get the vendor, project goes over, create a department to track costs, budget goes over, write a new law to track the budget, budget goes over, create another department to track the budget, ad nauseam.
At the university level, arguably the second most important person in the administration is the President of the Foundation and they don't even technically work for the university as the foundation is a separate legal entity. This person has a massive private army of fund-raising staff that shakes the alumni money tree looking for gifts large and small.
But, that doesn't mean that universities don't also have money chasers. My former uni had a special division in the recruiting/admission office that focused on high-value future alumni, aka, children of rich and famous parents. These kids were brought in for tailored events, private meetings with faculty and admins, and received a general red carpet treatment to try and lure them to the university.
Relatedly, Malcolm Gladwell has done some great work dissecting the money-making machines of the administrator class who go after $100M+ gifts (far more money than 99.99% of researchers could acquire in a lifetime of grants), and how this gift-giving continues to pile up at elite schools while starving everyone else of gifts.
Many of my social circle work as grad students at a private R1 institution, and I've seen a vast difference between the experience of students whose PI self-funds and writes grants all day (less interaction; most of the teaching and mentoring is done by postdocs) and those whose PI has an alternate source of funding like clinical work (grants are still needed, but less often; much more hands-on input on projects; much more direct mentoring). It seems like the top universities could definitely chip in to help ease that load on the researchers: science as a whole stands to benefit.
At least at my university, there were a few technical writers that would help out writing grants and papers. AFAIK they were underutilized. It may vary by field and from PI to PI, but, in general, PIs wouldn't accept anyone else's writing. Everything that was penned by a technical writer was rewritten.
> most of the teaching and mentoring is done by postdocs
Or (best of all, in my experience) some older tenured full professors that don't care about getting funding anymore (or funding is thrown at them because they're famous). A few professors in their late 70's would spend entire afternoons helping me get something to work.
That may be further evidence that the real drain on mentoring is the constant need to write grants. Sad that some really enjoyed mentoring younger people but they couldn't until they were about to retire.
Who would pay them?
Equipment is a special case in that "capital" equipment is usually exempt from most or all overhead. Smaller things are often not though, and some things are very difficult to charge to a grant at all.
And then they come to us for tax dollars, while also slashing their involvement with local communities. Why should education tax money go to universities that are so woefully inefficient and bloated with money when it could be spent on primary and secondary education, or better access to preschooling?
1) Nearly all university students are making their decision when they're 18, so a lot of them are mostly interested in non-academic aspects of their school
2) It seems really hard to actually start a new university. Anecdotally, every university I can think of is quite old.
3) This is probably related to (2). The benefits of universities aren't really dependent on good professors. The advantage of top schools in both educational progress and student outcomes can probably mostly be explained by signalling, filtering out weak students before they arrive, university culture, and networking. None of this has much to do with professors.
This has a huge effect on pushing schools to the top. You can take a public institution (eg University of Washington, since they're notorious for this), put an artificially high bar on entering a program (eg UW CSE since, again, notoriety), and take in all the public money you want while only admitting the top 10% of students. The school is happy for the brain blast, the state is happy to fund a "prestigious" university, and the people are happy to fund so many smart students attending their university instead of another. Everyone wins, right?
Wrong. One of the many problems with this technique is how a university quantifies what is considered a "weak student". Is it low test scores? Bad entrance essay? No planning on the part of the student? Whatever Pearson Hall or McGraw Hill will set on their outsourcing offerings? It's a giant can of worms in terms of what individual strengths and weaknesses are, and how they can either enhance or limit academic performance.
For other fields, school prestige can help you get that first job, but the value of being able to say you were educated at X drops quickly once you start working. Once you've been working for a while, your portfolio, social skills, professional network, and track record are far more important than the issuer of your degree.
And, yes, there's a market for them.
(This is also what a lot of private, non-research schools sell themselves as, and some actually fit the bill without being abusive frauds, though the money available and the historical ease of getting away with being an abusive fraud in that area have made it hard to find the wheat among all the chaff.)
We already outsourced custodial and food service. Do student accounts and financial aid really define us as a university? Are they part of our core competency? Do we do them particularly well relative to competitors?
Of course not. So why not outsource them?
This problem is further aggravated by the fact that students often choose universities based on reputation and are therefore rather insensitive to price increases, so extra spending on administrative staff is easy to fund. In addition, the government gives out student loans that match rising tuition costs, and because student loans can't be discharged in bankruptcy, they are an attractive asset class even for private lenders.
Other situations are complex - grants/external funding impose substantial administration requirements (while being larger and larger shares of revenue) or require substantial administrative expertise, often require things like outreach programs which PIs don't want to deal with, etc.
But at the same time, I think administrative salaries are frequently unconscionable.
"...some of this probably relates to a difference between personal versus institutional risk tolerance."