Classes are only a fraction of the in-person university experience in the USA today. Schools are priced accordingly, and they invest heavily in, and compete on, non-classroom factors: facilities, support services, activities, athletics. Then there is the socialization: the dorm experience, meeting people, parties, on and on.
So it would be a totally reasonable and logical next step in history for schools to say, "Hey, we believe that a lot of the value we offer is the in-person experience. So we will offer remote classes at a lower cost, to many more online-education students." Like Georgia Tech's online master's program.
But, with COVID, the universities have backed themselves into a corner because they should probably make the next year remote, but they made all these expensive on-campus investments and to stay afloat they need the ridiculously high tuition money (including from out-of-state and international students as mentioned). So all of a sudden, they can't afford to admit that remote learning is a much less valuable product.
(One thing I am ignoring is the real difficulty and cost for professors of teaching large remote classes in an actually effective way. I'm sure administrators will ignore it too.)
It's ironic, isn't it? What would the cost of purely attending classes and exams be? $2,000 or $3,000 a year?
But no, if citizens wish to educate and certify themselves (from a reputable institution, I might add), they must purchase the full package, including all the bells and whistles that come with societies, events, parties, support, and guidance counselors on the side.
It's really time to introduce an "I just want to learn and sit for exams" package for those unable or unwilling to pay for the extras.
Maybe the reputable organizations have a more comprehensive approach because that approach is more effective?
This is exactly the issue, and I am in no way doubting that these approaches are more effective. However, at some point, a decision was made in a fundamental trade-off between "maximizing the effectiveness of the experience" and "minimizing the barriers to entry".
I have heard of discussions in Germany (using as an example because I know what's going on) around whether the top tier universities (there is no real top-tier, they are all very good, but some have made a particular name for themselves) should charge more so that they can offer a more varied and dynamic experience.
But there are strong arguments on both sides, in particular against this approach because it goes against the fundamental idea that financial means should never become a barrier to entry - at any level.
At the end of the day, if you are lucky enough to end up at a brilliant institution, you will find a way to cope and get the most out of the experience (maybe you must rely on less support, and activities and events are more in the hands of self-organising students). That, in my view, is a better scenario than having potential students/classmates not be able to engage with each other due to means that are (often but not always) outside of their control.
> arguments on both sides, in particular against this approach because it goes against the fundamental idea that financial means should never become a barrier to entry - at any level.
These ideas are not incompatible. You can both raise prices and ensure that financial means are not a barrier to entry. Simply take into account the financial means of the applicant. Most of the top schools in the US already do this and are tuition-free for students with lower (or even middle) income.
This is a problem for the exact reason you mention, but also:
> This includes facilities, support services, activities, athletics. Then there is just the socialization, dorm experience, meeting people, parties
> ...they need the ridiculously high tuition money (including from out-of-state and international students as mentioned)
And these students from China, India, and elsewhere don't care about the great basketball team at the university or the football tailgating events or the pool parties! (or at least not as much :)
The bloat has also resulted in a proliferation of degree programs focused on what makes the schools the most money, rather than on the real-world educational requirements that may be necessary but aren't as "fun".
One of my colleagues teaches intro physics to 1000+ students remotely. It seems to work quite well, actually. His TAs screen the questions students have -- it's not like there are 1000 different ones; many are similar or the same. He says the large audience is actually an advantage: if he explains something badly or wrongly, at least some of them will pick up on it and ask.
I guess it really depends on the type of student. If I were a college student now, classes alone would indeed be only a fraction of the in-person university experience. Besides classes, I'd also need:
- Challenging homework with rigorous scoring and feedback -- this would be critical to my future success. Standardized tests won't cut it, as I need my professors and tutors to give me feedback on my proofs, my homework, my assignments, and my detailed (usually lengthy and overly verbose) reports.
- Opportunities to work on a research project with my professors and senior students.
- Office hour with professors.
- Seminar-like classes where students and professors dive deep together into a subject.
Things I'd miss: libraries. I used to check out 10 to 20 books at a time and spend hours and hours in my college libraries reading them and doing assignments. Also, classmates turning into friends: there are so many fond memories of hanging out with friends I made in college. We could probably make up for it by hanging out online, but the experience will be different from face-to-face interactions.
What I don't need and would be glad to see gone: all kinds of crazy sports programs. Yeah, football or basketball or whatever. They're nice, but I'm not sure they are multi-million-dollar nice, at least not to me. I don't give a rat's butt that Caltech didn't win a collegiate basketball game for 37 years. So what? As far as I'm concerned, Caltech will be the best college as long as it provides the best STEM courses and produces the best research results in the world.
From a selfish and ideological standpoint, I would love to see non-academic aspects of universities dismantled.
We get a lot of ego gratification every time our deans stand up in front of the faculty and say, “This year, we didn’t reject 85 percent of applicants; we rejected 87 percent!,” [...] That is tantamount to the head of a homeless shelter bragging about turning away nine of ten people who showed up last night.
There's an opportunity here to scale education from institutions all over the world to a level never possible before. Well... it was possible for many years. But the incentives for the institutions to commit to it never appeared until now.
This is a hard problem. But the mantra "scale education" ignores the quality side, and somebody who is shuffled through a "scaled" experience without personally investing and then given whatever credential at the end is still not actually competitive for a job and isn't being done any favors - they've just invested a lot of time in something without much personal value. What's worse, they've diluted the perception and thus value of the educational experience for anyone else who attended the same institution, even those who did take it seriously and learn.
People still send rejections for bad reasons (ego, etc.), and the cherry-picked 85/87% example seems illustrative of that. But the answer to education isn't scale - it's quality and specialization. Not everyone really wants an academic experience, as anybody who has spent serious time at a school can attest to (throwaway because I'm in that group).
Now, with the pandemic, an institution's survival depends on this accessibility. And the bonus is that the solution to this issue can greatly increase accessibility for many more people.
Certainly agreed that existing institutions (educational and otherwise) need to figure out how to effectively function remotely. Remote accessibility is a separate problem from scaling and reducing entry requirements - Harvard can (and likely will) have essentially the same students, just using Zoom rather than walking around campus.
And the point of my argument is that the main goal of admissions requirements and selectivity is in fact pedagogical (supporting an effective learning environment). Yes, there are branding/ego aspects, but again this is not a simple discrete 0/1 problem. The space is continuous, and "scale" is simply not the answer.
If your goal is making great knowledge readily available, then MIT OCW and many other existing MOOCs have really already solved that. They do it by simply putting it out there, but not formally evaluating/crediting participation. This is all great, and is I believe the right way to scale accessibility without diluting the focused experience of the institution itself.
But no, it is not realistic to expect higher education to magically fix all of this at the end of the pipeline by simply having a wider mouth. That has already significantly harmed, and will continue to harm, the actual value (and signaling value) of a higher degree. It just doesn't fix the problem to more widely apply a treatment that only works when certain prerequisites are in place - competitive programs look for those prior experiences because they really do prepare you for deeper study.
There of course should be (and are) undergraduate institutions that are not that competitive, whose charter is to educate those with relatively less opportunity earlier in life. The 85/87% numbers in the original parent comment however suggest that they are discussing either a competitive department or graduate program within a university - and their charter is and should be to advance the truth and quality of their field.
In other words - yes, give folks a chance. But don't expect higher education to fix a bunch of upstream socioeconomic problems (try to get primary education fairer first), and "give folks a chance" means "broad access to general undergraduate education" (and then still some standards, i.e. being willing to fail folks or guide them to alternatives), not "open all the gates to everything and scale it up."
This is a ridiculously elitist and unsustainable view, IMO. We should err on the side of giving people the opportunity to succeed, rather than discarding them. These are _people_, remember. Just because they were screwed over by the teachers' union, they do not cease to be people, and some fraction of them might go on to achieve great things if given an opportunity. There will always be "socioeconomic" problems. This is further exacerbated by the underdevelopment of vocational education in the US. In the USSR/Russia, unless you actually wanted to study, you didn't make it past grade 9. As a result, for the last two years of school it was no longer "fashionable" to be a shithead - all the shitheads were gone.
The dilution you're so worried about won't happen if it's structured like a MOOC. In fact the MOOC could (and probably should) be administered concurrently with the last couple of years of high school.
In the USSR (and Russia as it was in the 90's I don't know about now), the failure of high schools to provide sufficient education for "hard" higher education was implicitly acknowledged, and there were multi-year, free mail-based education programs administered by top universities which would, if you put in the effort, give you a better shot of passing the entrance exam, even if you're in a village high school where the physics teacher can't tell a Joule from a hole in the ground.
You'd receive printed materials and homework in the mail, and you'd be expected to study on your own (or perhaps with a bit of help from your high school math and physics teachers, ours were happy to help), and complete the homework. The explicit goal was to prepare you for the entrance exam, so it was kind of pointless to cheat or slack off.
These programs did not have limits on enrollment, but they'd get pretty darn hard towards the end, so a lot of people would wash out.
Given that the US high school system as a rule sucks major balls for most people (in spite of being very expensive), I'm advocating that this understanding be formalized, and that an optional first year be added to the higher-education curriculum to focus on fixing that damage, followed by an in-person entrance exam. Feel like you can do the exam without extra preparation? Proceed straight to the exam. Feel like enrolling sooner? Take the courses in addition to your last year of high school.
Re: the rest - read my upthread comments, suffice it to say I am supportive of broad opportunity and personally am a fan of MOOCs. But there's a difference between making material available and providing a supportive curated learning experience (which is what people expect when admitted into a university, and is essentially necessary for a research environment), and for the latter holding students accountable to standards really does matter.
It sounds like you have some interesting ideas that could be applied towards improving primary education (intermingling with MOOCs, etc.). That's great - as I said in my first response to you, advocate for those (I also agree that vocational education could help - that's part of what I was hinting at with "specialization" and "alternatives"). I particularly like your description about how it really did gate on the actual difficulty - that's pretty infrequent in these initiatives from what I've seen. That plus a robust test at the end could lead to a trustworthy credential used to apply to future opportunities.
Research institutions and competitive/graduate programs need to maintain quality in order to advance their field, and need to be selective to do so. The best fix for inequality is to fix it at the source, but to maintain an insistence of the actual value and integrity of the material. It hurts both individuals and the system to dilute and give people participation badges for passing through, and that is the approach I have more commonly seen when people push for "broad accessibility."
Crucially, I'm not arguing in favor of lowering the bar. If anything, making the mouth of the funnel much wider could _raise_ the bar instead as more talented kids will make their way through it. I'm arguing in favor of providing more structure and more targeted training to people who would currently fail to overcome the bar essentially due to random chance of having a shit math teacher in high school.
They _could_ in theory do self-directed learning, but without structure and a goal this gets old rather quickly and people give up.
If what you're talking about is more (robust/tested) MOOCs as a way to qualify for higher education, more structure/accountability for self-directed learning/availability of material, and a wider funnel in terms of more vocational education/opportunities - I'm absolutely on board, and would see those as suitable improvements to primary education. I'm just giving you a heads up that, in the US at least, most actual initiatives that try to "spread opportunity" don't do any of those things (too much work/investment needed), but really do just push for lower bars in higher education (wider funnel in terms of "let more people in to the thing we're doing anyway") and result in dilution of material. And yes, that does harm the value of the degree, as demonstrated also by jobs now looking for graduate degrees when in past undergraduate would have sufficed.
This is absolutely an area where the US could stand to learn from the standards and practices of its global peers. Sadly that is unlikely in the current environment.
Scaling access to bachelor’s degrees all over the world will only increase the demand for masters and PhDs.
You make a great point. In my experience, some of this is intentional thinning of certain majors. For example, pre-meds at my university always got a kick out of how many people gave up that major after their first Zoology class.
Those kids now have to catch up in order to complete the art history or general business major that they never wanted in the first place. The whole process we have created needs to be re-evaluated.
What would be worthwhile is if employers identified what skills or knowledge or experience they actually wanted and figured out how to both test for those (and not just trust the degree-granting system) and train for those (because the degree-granting system doesn't reliably supply it).
This is difficult for them to do, due to stupid laws. If they had a hiring exam, and it happened to show a certain race performing worse on it, they would get sued for racism and lose. So instead, they do the safe thing and let colleges filter out the undesirables for them.
- Many employers do in fact have hiring exams: see Triplebyte, HackerRank, etc. in our own industry, but also civil service exams etc.
- Colleges themselves rely on exams (like the SAT) with known racial disparities in scoring for admission.
- Colleges completion rates vary by race.
So one would think that employers would already be sued for racism if that were possible.
(But, perhaps the argument is that employers want to filter people by race, and they wouldn't be able to do so as easily with a hiring exam as via indirect means like recruiting from certain colleges.)
In my experience talent is way more evenly distributed than the recruiting practices of some industries would indicate. The best people whom I've worked with went to a mix of extremely fancy and very ordinary colleges. A few didn't go to college at all. It's a real shame.
Too many students or "students" wind up paying for the party for years or decades after they graduate. The incentive structures in academia are (mostly) not good and not properly aligned.
As a tangential nitpick, I was distracted by this statement: "The average age of a tenured professor is 55, meaning if you meet a 40-year-old tenured prof, there is someone at 70 teaching..."
I think it unwittingly supports another point from the article: "A combination of self-aggrandizement and elitism has convinced American universities that our services are worth indebting generations of young people, and now risking becoming agents of spread."
Pretty insane really.
I'm talking here about Physics, but I assume it's similar in other disciplines of the hard sciences.
This is less of a problem for higher-level classes -- you can write take-home problems that are hard and uncommon enough that students can't google-solve them. But even then, students will solve them in a team. Or in other words: the best students will solve them, and everybody will have the solution.
For lower classes, almost by definition, the problems are all textbook. Yes, you can change numbers around, and formulations, but with some google-fu, most problems are shallow :)
The situation gets even harder if you want to be fair to the students who are not on campus. How good is their internet? How do you handle "my internet went down"? Or: I'm in this other timezone, your test is in the middle of the night! What I did in the last semester is allow the students to take the test at any time of the day, and give ample time for the test itself. But of course, that gives them more chance to communicate, and help each other. Even randomized tests (i.e. with changing numbers) do not help here, because often the complication is finding the right formula. Putting in other numbers is then trivial.
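The point about randomized numbers can be made concrete with a toy sketch (entirely hypothetical, not any real exam platform): each student gets a variant with different values, but one shared formula solves every variant, so collaborators only need to pass around the method, not the numbers.

```python
import math
import random

def make_variant(seed):
    """Generate one student's variant of a textbook projectile question.

    The numbers change per seed, but every variant is solved by the same
    formula, R = v^2 * sin(2*theta) / g -- so randomizing the numbers
    doesn't hide the actual difficulty, which is knowing the formula.
    """
    rng = random.Random(seed)
    v = rng.randint(10, 30)           # launch speed in m/s
    theta = rng.choice([30, 45, 60])  # launch angle in degrees
    question = (f"A ball is launched at {v} m/s at an angle of {theta} "
                f"degrees on level ground. How far does it travel? "
                f"(g = 9.8 m/s^2)")
    answer = v ** 2 * math.sin(math.radians(2 * theta)) / 9.8
    return question, round(answer, 2)
```

Once one student posts "use R = v² sin(2θ)/g", plugging in each variant's numbers is trivial, which is exactly why per-student randomization adds so little.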
Then there are proctored exams, but they only help if you use a locked-down browser. I'd rather not put what amounts to spyware onto my students' computers. And they are pretty easy to fool anyway.
The only way around that is taking tests in testing centers. But there is an issue of scale here, especially in COVID times.
The second problem is lab classes. It's very hard to teach experimental-setup techniques online. How do you use a scope? Can you find a bad connection? Why did this resistor just blow up? You let the students play with sometimes quite expensive equipment; you cannot expect them to just have that at home.
Third, online teaching is a very bad experience for many students. Not so much the class itself -- I think that can often be remedied -- but things like the exchange with other students, the opportunity to ask the professor questions face to face, and even such basic things as having to get up and out of your room for class instead of watching it (or a recording) from your bed. Not all, but many students struggled hard to motivate themselves to study sitting at home.
I do believe that it's better to have no in-person classes, or maybe a bare minimum. Teachers will try to teach the students as best they can, but make no mistake, the education of many students will suffer.
A) Project-based testing proving critical thinking rather than memorization, and
B) Testing with the understanding that as long as you can find the answer via Google, notes, or a book in a reasonable time frame, this shows you understand enough to proceed? Is this not how much of the working world works as it is?
I try to formulate most of my questions so they don't just test knowledge. With higher-level classes, I care less and less about how much students know, or merely know how to look up, because the real problem is figuring out what they need to know, so to speak. For lower-level classes, this is a lot harder, because these classes often teach the required basics. You cannot be a good physicist if you have to google F=ma every time, the same way you can't be a good programmer if you have to search Stack Overflow for how to write a for-loop.
At Warwick, we gave 24 hour windows for most online assessments. This resolves the timezone issue. Internet/connectivity and other claimed technical issues are handled by the existing mitigating circumstances process. Students are expected to provide evidence, and the development team is also involved in these cases (providing log data which may or may not support the claim). We have pretty good technical data, so it's usually fairly obvious as to whether the student's claim has any merit or not. The main advantage we have here is that we built the entire platform in-house when the realities of Covid-19 became apparent.
Some departments have looked at "viva voce" interviews after the exam as an anti-collusion measure. We already had the infrastructure to do source matching on e.g. coursework, so typed-up exam solutions were also fed into Turnitin in the same way.
Is it flawless? No. Students can still technically start the exam early and share the paper with others who start later. But I'd have expected to see much higher mark averages if cheating was widespread - and some of the informal discussions I've had with teaching staff have suggested worse performance this year, if anything. Part of this may be due to the incentives to cheat not being particularly strong due to the "safety net" policy which was implemented - students cannot be disadvantaged by taking the exams - their average mark cannot decrease, and will revert back to a grade based on non-exam assessed work if they perform worse.
As you say, this is less of a problem in higher-level classes: you ask them deep and meaningful questions that build on the intro material and you see if they can give a cogent answer. This process already has its natural conclusion when we ask Ph.D. students to write and defend a thesis - but almost until that point, we continue evaluating students based on homework assignments and tests with clear right or wrong answers.
It seems to me that the answer, then, is aligning the incentives. For any classes that have only textbook work, tell students that everyone passes them, and while homework and tests will be offered, they're mostly there so students know for themselves if they're ready to take higher-level classes. If you want to teach students to be students (which is a valuable skill on its own, although most of them should have figured it out in high school...), find some freshman-level classes that don't have textbook work - say composition or research - and grade those. As the years progress, move towards classes where students offer free-form responses and assign each student different work. If students seem like they didn't actually learn the intro material, go look at whether they took exams and how, and then send them back until they do. A few will party all freshman year, sure - but they'll learn that their sophomore classes do actually mean business now.
This does, incidentally, involve getting rid of the general requirements classes unless you can evaluate them in a non-textbook way. You probably can just work it into the major - biology or computer science or music composition students will genuinely benefit from understanding introductory physics, calculus, etc, so test that in those classes (and make it clear to them they need to achieve a "passing" grade in those classes). If you can't, and you also can't ask students meaningful independent questions, there's no point in having your degree require the ability to memorize "the mitochondria is the powerhouse of the cell." I passed my college biology class with a C, and I am unsure what exactly would have been different if I'd put in the work to get an A.
Couldn't they also just use a second computer?
Additionally, it puts another "barrier of entry" in there: Rich kids can buy more electronics to cheat better. Or pay somebody to do the manipulation for them.
If you open a second computer they should know about it.
But employers will be tempted to use the same tech to measure employee time-on-task, and write employment contracts accordingly. Parents and teachers would want to use it to monitor study as well as exams.
The potential for social control is rather alarming.
The only real value a university confers is certification. Grades are really the result of a couple exams per class. The value of top universities' name on a degree is also largely built on the selectivity of their admissions process. But those students that they admit are not valuable because of the university - they're valuable regardless. So why not pivot to a world where companies administer certification tests for different subjects, at different degrees of difficulty/proficiency, at a low cost?
I think the question is better phrased 'Do universities IN THEIR CURRENT FORM actually make sense anymore?'.
I used to share your opinion, especially after completing my bachelor's, but now that I am back in the system doing a master's (in the EU), I 100% see the value of university education (though I believe it should be free for all).
The idea that you can meet in a room full of like-minded individuals and be taught by experts in their fields, be given deadlines, exam pressure, and assignments, and struggle together with other students carries weight. Especially if you are interested in science and research, in investigating and discovering, you will be taught the rigour and discipline involved.
Science is as much about the community as it is about the subject itself. All research builds off of previous ideas/experiments and many well-known scientists often conferred regularly and deeply with their counterparts around the world.
I think universities are an important place where young generations of future minds can come into contact with the members of these communities.
The sciences, medicine, and engineering disciplines do. Computer science is a rarity in being a high-paying field that doesn't.
(Although, given that the average developer doesn't really require a CS education, as many on HN say, I wouldn't count on CS remaining a high-paying field.)
Universities are afraid that by charging students $20,000 per year in tuition for an online degree, people will begin to question its value and possibly turn to other alternatives (perhaps even enrolling in universities abroad that have low or no fees), thereby pulling the curtain on the whole tuition-fee charade, which is that the lion's share is used to fund non-academic expenses (I'm guessing: administration, outreach, marketing, a nice website).
Universities that rely heavily on large sums of money from student fees are now faced with 2 choices:
1) Charge close to the same amount of tuition fees for an "online" university experience
2) Re-open and continue with business as usual, hoping the risks of Covid-19 transmission can be mitigated
Option 2 apparently seems like the more attractive version to maintain the status quo, which leads to attempts at convincing people that restarting face-to-face teaching is a reasonable way to go forward.
If this is what is going on, then this is yet another issue with universities that rely too heavily on income from tuition fees to survive.
The primary functions of (public) universities are to educate the population and to conduct research. The ability to perform those functions must be prioritised and subsidised via public funds, to guard against concerns about financial stability creeping into the decision-making process on how to conduct these primary functions.
I have studied in both the UK (9,000GBP/Year) and Germany (<€500 per year) and noticed a massive difference in the way both were run and my life as a student in each.
Is that really normal? As I understand it, most major universities’ revenue is driven largely by research, and the total student fees barely cover the costs of dealing with students; it's true that students paying full sticker price provide surplus revenue, but that surplus revenue largely goes to subsidize the costs of students paying below cost fees.
This really ramped up in the mid to late 1990s when we were told that Universities should be run more like businesses.
“The Idea of a University” by John Henry Newman
If you look at the stats, Covid is less deadly than the typical flu for those under 30. Time to stop destroying the youth by de-educating them. Stop being so scared. Let locals decide how to handle things. Stop picking isolated stories and applying blanket rules.
Even if it were merely as dangerous: getting the flu at 30 doesn't result in you infecting as many people, and not as many of those people subsequently die. Many people pass Covid on before they get any symptoms, and are therefore unable to just stay home and not spread it around.
Please stop passing around dangerous misinformation. Your advice could literally kill people.
This also involves an invalid comparison: College-age students, if they were broken-out by age among flu deaths, would show a very low death rate from flu, and COVID-19 death rates would be 5X+ higher, as they are proportionately for other age groups.
Now let's talk about sacrificing the faculty on the altar of reopening.
I know it is for me (age: 55 or so), and while it isn't currently a problem, I have no idea how it might present in 20 years.
By "less deadly" you mean "once infected, more people survive covid than the flu," right? But isn't it more infectious - and doesn't the longer incubation time make it more deceptive? It's fairly easy to control flu outbreaks, even in environments like college campuses, by telling people to stay home. (And you usually don't even need to tell them that, because they'll feel too sick to not stay home.)