Let's take history, which seems very "what": full of dates and names. A great course in history teaches you to think differently, and with deeper context, about world events and your country's politics.
I suggest that a society only interested in the workings of machines rather than the workings of people will soon treat people as mere machines. Let the nightmare begin.
Where the author is correct is that in those fields the output isn't falsifiable. Your math skills allow you to construct proofs that can be verified. Your study of history allows you to write accounts explaining and contextualizing historical events in a way that is persuasive to other people. But they are not falsifiable.
I agree with holding the former in higher esteem. Being able to communicate with people is important, but it’s an impoverished basis for an education. It’s terrifying that many students manage to graduate without much exposure to the world of objective reality and truth that exists around them (and which makes their lifestyles possible).
The content is not wrong in the same way a math question is, but many humanities classes are explicitly about taking arguments presented in essays, books, journals, by the government, by public interest groups, by private industry, and testing them against "objective reality and truth that exists around them." No humanities course just takes every argument at face value. Every argument is subject to intense scrutiny.
At least in my takeaway, simply calling someone or an argument "wrong" is not really what the humanities are about anyway. The humanities focus on the reasons people have and give for their claims. Often reasons are complex and tied into complex human contexts. Reasons are not just evidence; they are the entire baggage of argument, logic, context, culture, and history. The humanities focus on understanding those reasons. Deciding whether those reasons and claims are wrong is important, but it is not the entire purpose of the humanities, and it never was.
To be sure, there are people who come out of humanities programs with distorted views of the world. I have met many, and it worries me in some ways what more and more people are doing to the humanities.
But there are also those who come out of STEM fields with wildly distorted views of the world. And I think that is because they lack a solid humanities education.
Here is Orwell on the matter:
>"When the nautical screw was first invented, there was a controversy that lasted for years as to whether screw-steamers or paddle-steamers were better. The paddle-steamers, like all obsolete things, had their champions, who supported them by ingenious arguments. Finally, however, a distinguished admiral tied a screw-steamer and a paddle-steamer of equal horsepower stern to stern and set their engines running. That settled the question once and for all."
>"It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong."
People lived for a very long time without a proper appreciation of controlled, repeatable experiments, and progress was very slow.
1 - www.orwell.ru/library/essays/lion/english/e_saw
Who knows if it is even possible to model human interactions?
With physics, you can build a machine and test to your heart's content to determine what the underlying rules are. That's really tough to do when you're studying, say, an economy.
So, I don't mean this as a slight to physicists. It would never occur to me to use oil drops to measure the charge of an electron. It was a brilliant insight. But, with modern tools, I kinda think I could replicate that experiment in my apartment. Evolution? I mean, golly, that's a really subtle insight. I might be able to do something with petri dishes and poisons, but that seems like a pretty tough thing to detect. I'd like to compare Darwin to Newton, but Darwin is probably closer to Aristotle. We haven't begun to get to the really good stuff yet.
I'm skeptical of psychology; there are issues with replication all the time. But clearly they're on to something. The whole advertising industry is built on psychological insights.
I've dealt with crazy race conditions that make me want to pull my hair out. They're not consistently reproducible. Eventually I work out the logic and things fall into place. But I have access to the source :) I can't imagine how hard it is to get anything out of the random interactions of black boxes. Social sciences just aren't for me.
Anyway, yeah, I believe it's possible to model human interactions. We do it all the time. As with all things, some models are just more useful than others.
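The race-condition pain mentioned above can be sketched in a few lines. This is a toy Python sketch of my own, not any particular bug: an unsynchronized read-modify-write on shared state, and the lock that fixes it.

```python
import threading

# Two threads increment a shared counter without synchronization.
# `counter += 1` is a load, an add, and a store; another thread can
# interleave between them, so updates are sometimes lost, and only
# sometimes -- which is exactly why the bug isn't reliably reproducible.

counter = 0

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1  # non-atomic read-modify-write

# The fix: serialize the update with a lock.
safe_counter = 0
lock = threading.Lock()

def safe_increment(n):
    global safe_counter
    for _ in range(n):
        with lock:
            safe_counter += 1

threads = [threading.Thread(target=fn, args=(100_000,))
           for fn in (unsafe_increment, unsafe_increment,
                      safe_increment, safe_increment)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# safe_counter is always 200000; counter may or may not be,
# depending on how the scheduler interleaved the threads.
print(counter, safe_counter)
```

Having the source (and a lock) makes this tractable; with black boxes you only ever see the flaky symptom.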
I used to know someone who had nothing but contempt for developers and the IT team, because while he was busy playing the corporate ambition game, they just wanted to do a good job.
As far as he was concerned, this made them easy prey.
Guess which kind of person runs the world?
Engineering and science won't teach you this. You can finish your PhD with a completely unrealistic view about how politics works, and how political outcomes are generated.
Neither science nor engineering is immune to this. Popular beliefs and high-status areas of research are decided politically, not dispassionately.
It's tempting to say that things would be better if we had dispassionate objective AIs deciding policy, instead of individual and tribal ambition - but of course one of the challenges of AI is that instead of simply automating math, AI has the potential to automate and amplify influence and persuasion.
When you don't really understand influence and persuasion - but others do, in practice if not in theory - that's not necessarily a good place to be.
They also have a tendency to take a conflict like the one above, completely ignore the politics and motivations driving the arguments (namely the money involved), and pretend it was all a technical question that people magically could not figure out until someone did something simple.
> People lived for a very long time without a proper appreciation of controlled, repeatable experiments, and progress was very slow.
And now that we strive to only see what we can count and repeat, and worship our own creations as objective reality, progress seems to shift into the pathological. Even the very concept of "progress" as some kind of thing you can get more and more of, rather than a journey on a tree with infinite branches at every moment, is in lockstep with that and betrays an incredible impoverishment of thought. Every great scientist who also loved wisdom seems to have pointed out something along those lines, which mediocre scientists seem to ignore.
Hannah Arendt in Vita Activa has a bunch of relevant parts, here is one:
> The rise of the natural sciences is credited with a demonstrable, ever-quickening increase in human knowledge and power; shortly before the modern age European mankind knew less than Archimedes in the third century B.C., while the first fifty years of our century have witnessed more important discoveries than all the centuries of recorded history together. Yet the same phenomenon is blamed with equal right for the hardly less demonstrable increase in human despair or the specifically modern nihilism which has spread to ever larger sections of the population, their most significant aspect perhaps being that they no longer spare the scientists themselves, whose well-founded optimism could still, in the nineteenth century, stand up against the equally justifiable pessimism of thinkers and poets. The modern astrophysical world view, which began with Galileo, and its challenge to the adequacy of the senses to reveal reality, have left us a universe of whose qualities we know no more than the way they affect our measuring instruments, and — in the words of Eddington — "the former have as much resemblance to the latter as a telephone number has to a subscriber." Instead of objective qualities, in other words, we find instruments, and instead of nature or the universe — in the words of Heisenberg — man encounters only himself.
> [The German physicist Werner Heisenberg has expressed this thought in a number of recent publications. For instance: "Wenn man versucht, von der Situation in der modernen Naturwissenschaft ausgehend, sich zu den in Bewegung geratenen Fundamenten vorzutasten, so hat man den Eindruck, ... dass zum erstenmal im Laufe der Geschichte der Mensch auf dieser Erde nur noch sich selbst gegenübersteht ... , daß wir gewissermassen immer nur uns selbst begegnen" (Das Naturbild der heutigen Physik, pp. 17-18). Heisenberg's point is that the observed object has no existence independent of the observing subject: "Durch die Art der Beobachtung wird entschieden, welche Züge der Natur bestimmt werden und welche wir durch unsere Beobachtungen verwischen" (Wandlungen in den Grundlagen der Naturwissenschaft, p. 67).]
My sloppy/literal translation of Heisenberg:
"When one attempts, beginning with the situation of modern science, to feel towards the fundaments that have been set in motion, one gets the impression [..] that for the first time in history man on this Earth is only facing himself [..] that we kind of only meet ourselves"
"the type of observation determines which traits of nature are defined and which traits we muddle with our observations"
The reason we have a "left" and a "right" in politics is that it is impossible to prove one true and the other false, no matter how intense the scrutiny or how clever the arguments.
But we live in the world of people, and demonstrating proficiency in only one insistent cultural mode because it is the most dedicated to rationalism is an impoverished worldview.
The question is: to what extent is the current shape of the humanities aligned with those principles?
The humanities teach how to look from multiple perspectives, our values, what we believe and why, our self and our relation to society. It teaches critical thinking and analysis of complex, nonquantifiable factors, such as: should we declare war on North Korea? What does it mean to be Chinese American? Should I trust this person, website, or TV station? The humanities are about life, and virtually all the themes are immediately applicable in daily life.
For what it's worth, I enjoyed Prof. Rota's piece and didn't particularly object to his "what"/"how" distinction. I agree with commenters above that it's a misrepresentation of historical research, which can be deeply empirical. Unfortunately, a huge amount of history education, especially K-12, does boil down to rote memorization of past events, so I can't really fault him for the generalization.
Surprisingly, even Macron has articulated that maybe the Western world needs "great story arcs", which is a code word for shared ideals.
War is always an attribute of survival, and the problem is that the US population is pretty insulated from acute strife on the survival front. But the risks are there and can only be appreciated once you train in geopolitics.
Court cases are won or lost, but sometimes overturned on appeal. Even Supreme Court rulings can be overturned by later rulings.
Artists create art that can win or lose in the marketplace. Art can also gain or lose status based on how it appears to critics. Artists can be popular, notorious, or obscure.
Similarly for writers. Politicians win or lose elections. Nations win or lose wars.
Winning and losing in competitive situations seems more fundamental than falsification. (Take evolution for example.)
Arguably computing as a discipline is an important part of what makes contemporary lifestyles possible, and most of it, especially the practical parts, resembles the liberal arts more than it resembles mathematics. Areas like programming languages in particular operate more through argumentation and analogy than through proofs. Sure, there are subsidiary proofs involved that can tell you something useful about the building blocks, but the type-related theorems you find in the appendices of POPL papers are just low-level machinery, not the real goal, which is a usable programming language that people can do real work with. That requires constructing an overall language and ecosystem through a fairly liberal-arts methodology: looking at what worked previously, reasoning about what went well and badly, critically considering illustrative examples (often carefully constructed to make a point, much like the thought experiments in philosophy), and attempting to improve things on the basis of all that. If you look at a Rich Hickey talk, for example, that's pretty much entirely how he proceeds. And even in POPL papers, that's what you find in the meat of the most influential ones (as opposed to the appendices): the real result of an influential paper is typically a qualitative argument about programming constructs, where the argument is convincing and well supported by examples, but not "proven" in any mathematical sense.
I think the virtue is not in the value of the arguments, but in whether the arguments are falsifiable.
Programming has the virtue that it has to work for it to have value. Social sciences and humanities have to convince someone that it would work.
To pick a concrete example: Rich Hickey introduced transducers into Clojure a while ago, using an argument, illustrated by a number of examples, for why they're useful. Is this argument falsifiable? In principle some version of it might be, if you made "useful" more precise (useful to whom? in which contexts? how would you know?). But the kind of empirical work it would take to measure it in a non-toy setting is quite difficult, so afaik nobody's tested it, or even really formulated the question precisely enough to test it. In practice, you accept or reject the construct based on what you think of the argument, or you try to find a counterargument that makes them look like an inelegant/awkward construct, but in either case you probably aren't attempting to rigorously validate or falsify a scientific hypothesis relating to them. Basically all of PL design and evolution looks more like that than like Popperian science...
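For readers unfamiliar with the construct being debated: here is a rough Python sketch of the transducer idea. The names `mapping`, `filtering`, and `transduce` are my own illustrative stand-ins, not Clojure's actual API; the point is that a transducer is a transformation of reducing functions, so the same composed pipeline works with any reduction, independent of the source or sink.

```python
from functools import reduce

def mapping(f):
    # Transforms a reducing function so each input is mapped through f first.
    def xform(rf):
        def step(acc, x):
            return rf(acc, f(x))
        return step
    return xform

def filtering(pred):
    # Transforms a reducing function so inputs failing pred are skipped.
    def xform(rf):
        def step(acc, x):
            return rf(acc, x) if pred(x) else acc
        return step
    return xform

def compose(*xforms):
    # Composes transducers left-to-right (leftmost runs first on each element).
    def xform(rf):
        for xf in reversed(xforms):
            rf = xf(rf)
        return rf
    return xform

def transduce(xform, rf, init, coll):
    return reduce(xform(rf), coll, init)

# One composed pipeline, reusable with different reducing functions:
xf = compose(filtering(lambda x: x % 2 == 0), mapping(lambda x: x * x))

total = transduce(xf, lambda acc, x: acc + x, 0, range(10))       # 0+4+16+36+64 = 120
as_list = transduce(xf, lambda acc, x: acc + [x], [], range(10))  # [0, 4, 16, 36, 64]
print(total, as_list)
```

Whether this decoupling is "useful" in the sense Hickey argued is exactly the kind of claim the comment above says nobody has rigorously tested.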
What I am saying is that there is nothing wrong with these arguments or with using them usefully. What is wrong is when nothing is falsifiable or able to be proven, either. In other words, arguments are useful, but it does matter whether they ultimately have to agree with experiment.
Programming does, at the end of the day, have to agree with fundamental truths for it to work. Its foundations are on the metal, and everything is reducible to experiment.
Perhaps everything except the human element: most aspects of modern product development involve programming language improvements that have to do with improving human interaction with a computer. But even here, we have a sort of market for ideas in that developers who adopt better ideas will be more successful.
Of course, it has to work. But a sculpture has to "work" too in the sense that it shouldn't collapse. Structural integrity is only part of the goal. Similarly for programs.
You mean PL design, don't you? Performance work is very empirical. If my new type of inline cache is better, then I need to prove that, and it's falsifiable (using benchmarks, which I admit aren't ideal).
> Of the 133 papers published in the surveyed conference proceedings, 88 had at least one section dedicated to experimental methodology and evaluation
And anyway the claim was 'very rarely', when it's actually a majority of cases.
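A toy sketch of the benchmark-driven, falsifiable claim described above. The two functions are illustrative stand-ins of my own, not real inline caches; the workflow is the point: verify both implementations agree, then time them and let the numbers settle the question.

```python
import timeit

def lookup_every_time(items):
    # Indexes the list on every access.
    total = 0
    for i in range(len(items)):
        total += items[i]
    return total

def cached_iteration(items):
    # Uses the iterator protocol, avoiding repeated index lookups.
    total = 0
    for x in items:
        total += x
    return total

data = list(range(10_000))

# Step 1: the optimization must preserve behavior.
assert lookup_every_time(data) == cached_iteration(data)

# Step 2: the performance claim is settled by measurement, not argument.
t1 = timeit.timeit(lambda: lookup_every_time(data), number=100)
t2 = timeit.timeit(lambda: cached_iteration(data), number=100)
print(f"indexed: {t1:.4f}s  iterator: {t2:.4f}s")
```

If the measurement contradicts the claim, the claim is wrong, which is exactly the falsifiability being argued for (with the usual caveat that micro-benchmarks are imperfect proxies).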
I disagree. Historians are constantly making predictions about events happening today, based on their knowledge of history. Some are accurate, some are not. They can then take this output and use it to refine their view of history.
This doesn't even cover how history is written by very unreliable narrators, and historians have to do more than learn the "what" to uncover the "how".
Your argument is also somewhat limited to History, when Philosophy, Psychology, and other liberal arts degrees do not have the same limitations.
Two Dogmas would be a good place to end up.
It's also weird to suggest that no historical claims are falsifiable. That seems to put real history on a par with nutty conspiracy theories.
In my humble opinion, the scarceness of falsifiability in history is the reason it is so much easier to create nutty conspiracy theories about history than about, say, mathematics or physics.
History is roughly as falsifiable as astronomy or biology.
At least for mathematics, I have hardly seen any "nutty mathematical theories" (I am a PhD student in mathematics, but not a physicist, so I cannot say anything about physics here). The reason, as I see it, is that for a math text to be considered solid, it has to contain good, understandable proofs, which are hard for "crackpots" to write. To be more precise: on sceptics' blogs I have of course seen links to papers containing "mathematical crackpottery", but I have hardly ever seen those in the wild.
Ask your favorite well-known professor how many proofs of P=NP, the Riemann hypothesis, or whatever they get per week. Sure, they're all Time-Cube level crazy, but most off the wall theories are, too.
My PhD advisor actually works in an area related to complexity questions that bear on P=NP. So if he were getting spammed a lot with this kind of paper, I am pretty sure that I would know. Similar statements hold for other researchers who work on the same floor.
You might object that even though they work on questions related to complexity theory, they are perhaps simply not the people an "ordinary crackpot" thinks of as the target readers for their texts; in other words, one has to know a little bit about mathematics to know that the questions these people work on are actually related to P=NP. I will not disagree with this objection.
You may also object that the respective persons are well-known and well-regarded in their respective community, but the general public is not so much aware of them. I also will not disagree with this objection.
Nevertheless I strongly believe that if your claim were true in general, I would probably know.
> But then why do you judge the conspiracy theories to be nutty, if they are on just as firm an epistemological footing as actual history?
But to elaborate on your argument: because of this dubious epistemological footing, "actual history" indeed does not have the highest reputation with me. The reason I disregard many "nutty conspiracy theories" lies rather in the fact that many "conspiracists" have a tendency to find conspiracies in other topics, too, where falsification is much better possible. Thus I tend to judge by looking at the track record of the respective person.
This is exactly one example for my "where falsification is much better possible" - in this case because of physics.
But by picking over the details of a particular example, you're missing the point. Surely you don't want to say that you can reject crazy historical claims solely because the proponents of those claims make outrageous claims in other areas too. If you think the moon landings are an exception, for some reason, then let's look at other examples, e.g. Holocaust denial.
Falsifiable means subject to test. It doesn't mean provable in the sense of math. Only math is provable like math.
Could you provide some examples?
How does this perspective that people are stupid gain ground? It seems to be a self-serving, self-important, and extremely judgemental perspective held by some insular techies.
The world is as much about politics, people, and culture as it is about science. A failure to understand how the world works not just scientifically but also politically, culturally, and economically, along with the history of societies and their accompanying philosophies, leaves individuals with no context for or understanding of their world. A one-dimensional perspective, in either direction, impoverishes rather than enlightens.
In its current humanities-style argumentative form, you somewhat contradict yourself: either your argument holds power, then why do you dismiss this family of arguments as inferior? Or your argument doesn't hold power, then why would you expect us (or even yourself) to be convinced?
In the end, I think education is also the vehicle and a platform for students to have the intellect to question and to challenge established facts.
Mathematics isn't really a science.
Anyway, I concede people can be good in one way or another while also being a total shitstain. See Spacey, etc. But Peterson has always been too angry for me. He's like Colbert with no satire.
... in which he says that high-powered careers are very unpleasant due to hyper-competition among people who are exceptionally willing to tilt their work-life balance way toward work, and that most (but not all) of such people are men. Which doesn't sound like that unusual a claim, really: it's a fairly mainstream belief that women average more family-focused than men, and you only need a small difference in means between populations to get a big difference in representation at the far tails of a trait's probability distribution. (I would have preferred that he make the point with less bluster and more graphs, but that's just a matter of taste.)
Is that what you were alluding to, or does he somewhere else contradict this to make the crazy and very different claim that "only men have high enough motivation to be successful in competitive industries"?
About 7:00: "And you think when you're 19, 'cause you're so clueless when you're 19, you don't know a bloody thing, you think, "well, I'm not really sure I want children anyway". Like, oh, yeah, you can tell how well you have been educated. Jesus. Dismal. Dismal. Without fail.
"I've watched women go through their professional careers, many, many of them. It's a very rare woman who at the age of 30 who doesn't consider having a child her primary desire. And the ones that don't consider that, generally, in my observation, there's something that isn't quite right in the way that they're constituted or looking at the world.
"Sometimes you get women who are truly non-maternal, you know, by temperment. They have a masculine temper [-ment?], disagreeable, they're not particularly compassionate, they're not maternal, they're not that interested in kids. Fair enough, man. But there aren't that many of them and there's plenty who will not admit to themselves that that's what they most desperately want."
I saw it when one of my co-workers started sending me Peterson vids. God, I wish I had never heard of this guy.
Okay, I made the mistake of watching the linked video. At one point he claims the old boys' club doesn't exist at law firms. I mean...where did the term come from?
Well, sure, you see, those hyper-competitive men who happen to like that sort of environment would never, ever, do anything dishonorable or unethical to reduce their competition. It's purely "market determined right to the core".
 Yes, I went back to get the quote.
Hmm...pretty much everything you claim here is wrong.
First, his is not a claim from first principles or ideology, but an observation from many years working with at least one highly competitive industry: lawyers. So independent of what you want the world to be, this is what he says he observes in reality.
Second, his claim is not about "success". It is about reaching the top of whatever dominance hierarchy you find yourself in. By definition, only very few people can be in that particular place, and his central argument is that equating the two (your personal success = reaching the top of some dominance hierarchy) is a losing proposition for almost everyone, simply because there isn't enough room to go around.
Third, and maybe most importantly, he says that even if you are one of the few people who have the ability/luck/desire to achieve this very dubious form of success, it is almost certainly a losing proposition because it sucks on just about every other metric of having a good life, unless you are psychologically structured in such a way that this alone will make you happy.
Fourth, he notes that women are generally smart enough to figure this out, as are most men. There are a few men who don't, who will subordinate everything else in life to reaching the top of that dominance hierarchy. His claim is that most if not all of these people will not have a good life.
And finally, as usual, this says nothing about individuals, but only something about populations. In his case, he says that law firms are doing everything to keep their highly qualified, super smart and highly effective female lawyers, but they can't, because most of them just drop out of that race to the top.
Are you claiming this is not true?
>"Frozen was propaganda, pure and simple."
What I did check did not check out.
They show a video of Peterson arguing that being the CEO of a billion-dollar company is basically pointless, stupid competitiveness and that men happen to be more pointlessly stupid in that regard, and the blog interprets it as him saying women should not try to be CEOs.
That video is the center of the argument (the thing the post 'builds up to') in the blog's own estimation.
I wonder if you could actually get him to agree that everything wrong with the world is due to its being run by pointlessly stupid, hyper-competitive men.
"Peterson has emphasized his desire to provide students with better and more affordable education. “There is absolutely no reason why high quality education can’t be made available to masses of people at low cost,” said Peterson in a recent interview on CTV’s Your Morning. “I think it’s a scam pretty much from top to bottom and it’s a very expensive scam.”
"He has not clearly stated when he will form the online university, though he said on Your Morning that he will soon start a website that will distinguish between postmodern and classical content and “cut off the supply to the people who are running the indoctrination cults.”
"Peterson also expressed that his online university would be an alternative to traditional universities, which he believes “have abandoned the humanities.”
"“About 80 percent of the humanities papers are never cited once and the humanities have been dominated by a kind of postmodern neo-Marxist, cult ideology,” said Peterson. “[The humanities have] abandoned their mission to students. Their mission should be to teach students to speak, to think, and to read, and to become familiar with the best of the world fundamentally.”"
 I think they're talking about this video (http://www.ctvnews.ca/video?clipId=1169448), but CTV doesn't seem to do transcripts and I live in the wrong "region".
At the end of the day, these guys always work for somebody, who is usually some sort of liberal arts or business major. It's something to keep in perspective when somebody asserts that math and science run the world.
Usually that someone is a liberal-arts major because they had a trust fund, not because the liberal arts turn you into an insightful, far-seeing, compassionate leader.
Hell, once the frameworks get mature, even the programmers don't need to know much. My understanding of the math behind crypto is incomplete, but I successfully implemented validated systems with other people's "Lego bricks".
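A small sketch of that "Lego bricks" point, using only Python's standard library. The function names are my own; the point is that you can assemble a validated message-authentication scheme from well-tested primitives without knowing the math inside SHA-256.

```python
import hashlib
import hmac
import secrets

# A fresh random key; in a real system this comes from key management.
key = secrets.token_bytes(32)

def sign(message: bytes) -> bytes:
    # HMAC-SHA256 tag over the message; the primitive is a black box here.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    # compare_digest does a constant-time comparison (timing-attack guard).
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"hello")
print(verify(b"hello", tag))     # True
print(verify(b"tampered", tag))  # False
```

The bricks do the hard cryptographic work; the programmer's job is mostly to snap them together correctly (e.g., using `compare_digest` rather than `==`).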
Understanding how societies and civilizations reacted to different pressures and how the structure and nature of their institutions evolved and handled those pressures is much more important than remembering what happened where.
Those who fail to learn from the mistakes of history are doomed to repeat them, as they say.
I frequently forget the trig identities, but I know the basic definitions of trigonometry and that there ought to be certain identities, so I know I can re-derive them or look them up. But, I only need to do this when I teach calculus.
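As a quick illustration of the re-derivation point: the Pythagorean identity falls straight out of the unit-circle definitions, and a second identity out of that one.

```latex
% A point at angle \theta on the unit circle is (x, y) = (\cos\theta, \sin\theta),
% and every point on the circle satisfies x^2 + y^2 = 1, hence:
\cos^2\theta + \sin^2\theta = 1
% Dividing both sides by \cos^2\theta (for \cos\theta \neq 0) recovers:
1 + \tan^2\theta = \sec^2\theta
```

Knowing the definitions makes the identities recoverable on demand, which is the "how" over the "what".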
> You can also re-discover the dates by doing your own historical research
My point was that you can put a mathematician in a closed room with only pencil and paper, and they could redevelop an entire theory if needed. A historian in the same position would be unable to do any work -- they need primary sources. I went with your "learning lists of dates" as a stand-in, since, like primary sources, you can't derive dates from scratch.
Sure, history has a "how" of research methods, but the research products must rest on the "what." Mathematics is almost entirely "how." The "what" for a mathematician is mainly knowing what's been done and what needs doing, for the purposes of directing research and giving attribution.
Also, I am pretty sure G.C. Rota was being somewhat facetious in quite a lot of the article. These are "lessons of an MIT education." Lessons aren't necessarily truth.
Often mathematicians will make informal arguments but the idea is that you should be able to add rigor when needed.
I agree that the study of history is the best method yet found for recognizing the errors which produced those enormities, in order to avoid repeating them. But I'm not sure how effective undergrad history courses, or even history specializations, are likely to be in equipping their students with the tools for such study. Granted I've never taken such a course myself and so cannot speak with authority on even one example of the type - but I have found it not at all unusual for people who do have such education to be surprised in even so elementary and recent a matter as the American origin of the eugenic idea. Quite aside from being in my opinion a necessary antidote to the idea that the United States are somehow blessed with a permanent immunity to such enormities as we here discuss, such ignorance, on the part of so many supposedly educated in history and - if you're to be taken at your word - historical analysis, does not inspire enormous confidence in the value of such education.
Why? Do we have any evidence that it helps at all?
Here's what I find weird: when one is talking about learning history, the "pro" side usually cites the importance of the subject, and the "con" side, the methodological flaws. This discussion seems a bit pathological: everyone agrees that if a certain avenue of study can reduce the probability of genocide, it should be undertaken. The question is that maybe (probably) these avenues of study do nothing of the sort.
Of course, we can disagree about that. But how would we settle/improve our knowledge of the question?
That course's treatment of the causes and importance of the industrial revolution suggests that access to cheap energy is the key difference between an agrarian and high technology civilization. That idea and that particular history professor's connections to Washington (he was executive director of the 9/11 commission) helps to explain a forever war in the Middle East and climate change denial.
Context matters. If you are studying history when you are about to be promoted to general of your army, or pursuing a career in diplomacy, these things make sense in the context of your career.
For general folks, the value of this information for all practical purposes is zero, compared to, say, an education in STEM, where many concepts can readily be applied across other disciplines, as it's just math at the end of the day.
For this very reason, a civil engineering graduate can pick up CS far more easily than, say, someone with a graduate degree in history.
Compared to what? Clearly he doesn’t know how to make a historical argument — perhaps he considers such arguments beneath him.
Requiring an author to expand on every question a piece brings to readers' minds is just an unreasonably high bar.
Indeed. I owe a significant amount of my professional and financial success to being able to communicate my/my teams' ideas, and to break down and communicate company strategy to my teams.
After a certain base level of excellence in engineering or another technical field, the distinguishing characteristic among levels of accomplishment seems to be communication, not being a further 0.5% better at engineering. I see so many great technical people who are unable to communicate effectively and suffer badly for it.
I learned a lot of my communication skills studying the arts, and staging plays, musical performances, etc. My best teachers about clearly communicating expectations and deadlines, and giving meaningful feedback in a respectful way, have been people in the arts.
At my first hackathon, I got the same vibe I often did in the last few days before a play or concert. Then when I started running a civic tech project, more and more patterns with working in the arts became apparent. Lots of volunteers, everybody has day jobs, you need to provide people with opportunities to do work they care about, but not pressure. And you need to provide leadership without being a dictator, and meet external milestones with a "show must go on" attitude when the time comes.
These days I think that the creative element of programming is why software is eating the world at the pace it is. Undergrads can look at each other's code and say Wow! while one titration is fungible with every other.
That's simply not true. The reason software is eating the world is that the return on investment is so much higher. Every time I read a publication from Phil Baran's group I have to pause several times to think it through; it's very deep, very subtle. Website is here, btw: http://openflask.blogspot.com/
It's actually worrying that computing has overtaken science and engineering. Back then, Arnold Beckman made a fortune in scientific instruments and enabled the world to do better science. Nowadays, Google and Facebook sit on huge corpora of natural language texts, and the only thing they do is build better AI for better targeting of ads.
Seeing past this level of understanding is an attrition gate in any subject. What I mean is, the students who got past freshman chemistry to become good chemists are the ones who saw the creative element in chemistry.
Those same students may have taken the programming course and seen different pieces of code as fungible with one another. Or in math, proofs, or in English, essays. I know people who came away from the freshman programming course with the conclusion that "it's all just memorization." Clearly it means they didn't get it, but it also means it's a widespread impression.
As for chemistry, for reasons we don't know, part of developing creativity as a chemist is to get what they call "hands," which means being able to take care of yourself in the lab without wreaking havoc on everything. You need to have a sense of what can and can't be done in a practical process, in order to see your way through a difficult project.
Advertising and GUI design are the sugar water of computer science.
The "coding interview" tells me that at least some aspect of programming is akin to titration.
But perhaps to generalize, the relationship between "look at how creative this discipline is" and "look at what people actually do with it in the real world" is an issue facing every academic field.
Agreed on the generalisation though. I'd even wager that the "creativity of this discipline" idea is what is attracting new students to fields that aren't too viable after graduation.
Do you not know anyone working at companies like Google? They have tons and tons of computer scientists working on serving the most applicable adverts as efficiently as possible to as many people as possible.
So far the society that treats people as mere machines is constantly mouthing platitudes about the humanities to itself, chanting, "Think about people! Think about philosophy!" as though this will actually alter the balance of class power.
USAians already take more humanities courses in a STEM degree than almost anyone else in the world. It doesn't seem to accomplish much to turn everyone into fluent humanists.
Mind, other countries specialize their students in early or late secondary school, so the differences may not be quite commensurable. A UK, French, or German student going into a STEM degree will probably have taken calculus and linear algebra by the end of secondary school, but they may not have taken an equivalent to America's AP World History. If you told them they ought to take a first-year university course in world history to get into a STEM degree, they'd look at you a bit weird, but this is admittedly how American admissions works.
I also know an American who basically took every available AP course, had a perfect grade average, volunteered in Girl Scouts, won valedictorian of her well-regarded high school. She ended up wait-listed to MIT, admitted with no financial aid to Cornell, admitted with a moderate scholarship to McGill, and admitted with a full-ride and honors to her state flagship school. She took the full ride.
(Also, we met and later married :-p.)
This contrasts with Technion (I did research there), where you could get into the top-flight STEM institution of the whole country just by earning sufficiently high marks in your matriculation exams.
It really does give me the impression that both other countries educate more rigorously at the secondary level, and Americans have a perverse fetish for inhumanly grueling competition. I think this may be precisely the opposite of a good system: a good system would assure that everyone has a very solid baseline, and then maximize the fraction of top talent that makes it through the training pipeline to do the most difficult work.
We are machines. Squishy, carbon-based machines with certain unique computational capabilities, I'll grant - but still, machines.
Is there any truth to this? Because it seems like classical university jingoism to me. "My institution is better than yours." Anecdotes like "All MIT graduates I've met were dumb" or "All MIT graduates I've met were smart" do not count.
Because I looked at the DE course he taught (18.03) and I completed harder math courses than that in my non-MIT education. I'm sure many other HN readers have too. I wonder if there is some test you can take to see if you are just as good as an MIT graduate?
The EU has done great work in this area by trying to standardize the university curriculum across the union. What that means is that a master's degree in computer science from the university of München is mostly comparable to one from Madrid, so name-dropping your university "does not work." It also means that it is trivial for a Spanish student to study one year in Germany and then come home to Spain (see Erasmus). The US system, where some colleges are rated higher than others for irrational reasons, is strictly worse.
The content covered in the curriculum, and the speed at which it is covered, is one way in which one school can be more challenging than another.
The more important aspect, imho, is your peers. Speaking for myself and what I have heard others experience, peers can push your thinking to more sophisticated levels -- this can be in terms of something like elegance in problem solving (which has valuable real-world applications), the ability to apply the knowledge in a wider range of contexts, etc.
This is also often reflected, rightly or wrongly, in the assessment stages. If you take a class with wicked smart peers, the test is usually going to be much harder just so that the test can evaluate differences in knowledge.
I remember showing my $IVY calculus final to a friend of mine who set the curve in what was supposed to be the same class at $STATEU, and his mind was blown. It took him a few minutes just to realize that the test covered the same topics, while I had thought of the test as merely "hard" pedestrian content. Once I talked him through the problems, he realized how cool it was, and then he realized one of the key differences between our schools -- the boundaries of my thinking on topics were challenged and stretched much, much more aggressively.
To be fair, some non-elite schools are as rigorous as or more rigorous than elite schools, but this is usually at the department level and is the exception rather than the rule.
Well as long as you were able to demonstrate that you were better than him, it's all OK.
The whole thing was a pitch to get him to transfer (I think he would have easily gotten in), but he didn't want to leave our home state.
My point was merely that the school I went to pushed the boundaries of my thinking, while his (at least in that class) did not.
Did that make my school better? For my personal goals, it did. For his personal goals, it did not.
1. PISA is a widely administered test for 15 year olds.
2. TIMSS is a widely administered test for 4th and 8th graders (Advanced TIMSS covers the "last year of secondary").
3. We are talking about university -- specifically elite universities.
4. I am extremely familiar with education in East Asia (most familiar with Japan), and their teaching of the fundamentals of math in junior high school and high school are in fact excellent. Sadly, this does not translate well into creative thinking. The best math people in Japan are largely limited to a small number of schools (3 or 4 elite ones and maybe a handful of others that have some players).
5. While you are correct that the average quality of education in the US is lower than other places (there are lots of US-specific reasons for this), the education at US elite universities and the US high schools that feed into elite universities is quite a bit higher than the US average. It's tough to find hard statistics on this because it would reveal a class bias in the US (gasp, there is one!), but spending some time in these classrooms and talking to and working with the students from these schools is fairly convincing anecdata.
I'm not sure what you have against US schools and/or US elite universities, but I encourage you to open your mind -- there is a reason why we have a lot of world-class top-rated (by any measure other than cost) universities here.
3. What makes you think that MIT would be more elite than elite universities in other countries? Talent follows a normal distribution, and if PISA and TIMSS show that the American mean is below average, the American elite must also be below the elite average elsewhere. So, if the most gifted American students are admitted to MIT and the most gifted Singaporean students are admitted to Nanyang Technological University (NTU), the latter university will contain the more gifted students. And if it is true, as you claim, that the quality of the education depends on the skill level of your peers, then NTU must be (much) better than MIT.
4. You can't measure creative thinking and the "Asians may be good at math, but they aren't creative" thing is a cliche.
I don't have anything against US schools or elite universities. Rather, it would be you who has something against non-elite universities, because you are claiming that you can't acquire the same education at non-brand-name universities. My claim is merely that one can excel at any university in any country as long as one puts in the time and effort. Especially when talking about math: you need a book, pen and paper, and solitude, and you're set.
I would be happy to put my money where my mouth is and run down to my local university with your hard calculus final. :)
Which is obviously why only 12 of the top 25 schools, and 4 of the top 4, are American. I'd cite the logic failures in your argument as well, but I think letting actual statistics speak for themselves works better. And FWIW, I'm citing university rankings published by a non-US publisher, lest you think it's merely patriotic bias.
But having taken some physics and math courses at multiple universities, MIT's went much deeper in a shorter period of time. This isn't necessarily reflected in the syllabus, because the topics may be the same, but the devil is in the details.
MIT had absolutely fantastic problem sets that took me 10+ hours a week per class to finish and were rarely changed from year to year. This is because they've been tuned for so long that whether you get the answer correct is almost beside the point; the useful part for learning was the process of banging your head against them. This was true to an extent at other universities I've been at, but the proficiency expected to excel in a course was usually not quite so severe.
I ended up getting a PhD (in materials science) and spent many hours working as hard as I could in lots of classes, and despite honestly knowing my stuff quite well, I never got an A in an undergraduate physics course at MIT. Those went only to students who were obviously frighteningly talented, often with research experience in the course material already. Don't take that as sour grapes or anything; I am super proud of my B's in these courses. But MIT grades harshly, which may be part of why it's "harder".
Most MIT students stop caring about grades freshman year. Most MIT students interact with one another in class as fellow masochists struggling together against a common foe (learning the material) rather than competing against one another for grades. This is what I would personally call the weirdest and most advantageous aspect of MIT versus Ivy Leagues, but I actually think state schools are awesome at this too.
I'd have put that environment on the list of lessons way before any of the ones on there. Learning the value of close collaboration with people, regardless of whether you may think (probably wrongly anyway) that they're "better" or "worse" than you. Learning that when it comes to the real world, on a team you're all up against a way bigger opponent than each other, you're up against the laws of nature. And that success against that opponent is far more satisfying than any grade.
I think that mattered more than difficulty and speed, at least to me. Having smart people able to answer very specific questions and being encouraged to check their answers really helped with how I continue to view the world and to learn.
There are many schools around the world with much harder and more technology-intensive curricula than MIT. If you disagree with that statement, then you do see "value in competing over whose classes were the hardest." If you do not disagree, then you must agree that the "MIT experience" you describe can be gotten at many schools other than MIT. For example, I also worked 10+ hours/day, 7 days/week, year round at university, and also felt that getting A grades was impossible.
I think the professor is making a great point by comparing math to sports. Yes, you can go to a sports university, but you don't have to. The way to get good at it is by busting your ass off (and having good genetics). It's your own effort that counts, not the name of the institution, imho.
Yes, I think that there are schools around the world with harder curricula than MIT, with the caveat that every university has different strengths. I could go into my accomplishments in undergrad, but it's truly beside the point.
At some point it's truly ridiculous to compare as if it's a linear ranking. It's counterproductive to worry about rankings, and causes completely unnecessary friction between institutions. Pride and insecurity get in the way of solving problems, and all of us are ultimately struggling against a much more interesting challenge than each other.
I think the professor here is really not making a compelling argument, to be honest. None of the listed things are unique to MIT for sure, though strictly speaking he doesn't claim that they are -- he says that they are lessons of an MIT education, rather than unique aspects of an MIT education. He's a mathematician so this may be intentionally precise.
It matters very much whether you got your CS Masters from a place like Cambridge or ETH Zürich versus some small university in a small member state. In my experience (having studied and taught in multiple European universities, both before and after the standardization) Lesson 6 very much applies to the EU.
The main achievement of the EU standardization is that a Bachelor degree from any accredited university from any member state will technically qualify you to the Masters programs in all member states -- before the Bologna Process some countries didn't even have separate Bachelor and Masters degrees.
The Masters programs themselves are standardized at the level of nominal effort, but vary wildly in topics offered, actual effort required, and quality of the teaching. No standardization can change the fact that the top people will seek out the top universities, which then heavily skews the skill distribution between the universities.
The structure of the programs is standardized in the EU, not the quality of the programs.
That method seems to encourage superficial understanding unless you're willing to devote a large chunk of your time to working. And by that I mean nearly all day. There is a strong culture of pushing those boundaries. For me, MIT helped me realize the importance and value of life/work balance more than anything. I do my best work when self-directed and moving at my own pace - not cramming for exams. But I am just one data point. I'm sure it worked well for others.
Not to say that non-elite universities can't be as rigorous - some certainly are, and some students will be better than others. The brand is definitely a part of it too.
This is stated very well. I will be using this verbiage (specifically the "expose sloppy learning" part) in the future in my occasional defense of hard tests.
As someone who went to both a very average university and a top ranked school, I fully agree. There were only two differences between the two:
1. The top university packed more material in a course (not a good thing, in my experience).
2. The top university had more peers who drive you to be better. This I really felt the lack of at the average university. All the motivation had to come from within, and although that was somewhat beneficial, it was also very aimless and inefficient. Synergy really is a thing.
(As an aside, the quality of instruction was not very different - perhaps slightly better in the average university. In my experience the role your peers play is much more impactful than the role your teachers play).
"One answer to this question would be following: One learns a lot more when taking calculus from someone who is doing research in mathematical analysis than from someone who has never published a word on the subject. [...] But this is not the answer; some teachers who have never done any research are much better at conveying the ideas of calculus than the most brilliant mathematicians.
"What matters most is the ambiance in which the course is taught; a gifted student will thrive in the company of other gifted students. An MIT undergraduate will be challenged by the level of proficiency that is expected of everyone at MIT, students and faculty. The expectation of high standards is unconsciously absorbed and adopted by the students, and they carry it with them for life."
I'm going to look funny at the claim of "expectation of high standards"; I suspect he's delusional.
But there is, in my experience, a considerable difference between "someone who is doing research" and "someone who has never published a word on the subject"; I've never met any "teachers who have never done any research [that] are much better at conveying the ideas." Further, the "ambiance" bit is right; being surrounded by people who are deeply interested in a topic beats the pants off people who are just there to get a grade.
Just looking at the curriculum doesn't really tell you anything useful. How well the material is taught and to what depth matters much more. Something simple like "learn matrix multiplication and determinants" could either be covered in 45 minutes or take up several lectures going deeper and deeper into the subtleties of what those things actually mean on a fundamental mathematical level.
Then there is of course the personal additions a lecturer can add to each course beyond the curriculum, which in many cases can be more interesting and educational than anything on the curriculum itself. Basically you can have a dozen lecturers covering the same curriculum using the same textbook and get a dozen very different educational experiences. You can probably, to a lesser extent, even have the same lecturer lecture to a dozen different groups of students and get a dozen different outcomes.
That being said, I have no idea if MIT is actually better in this regard.
In my view, the ultimate proof as to whether you deeply understand something is when you can engage in rigorous synthesis using that thing, or correctly explain it to others without using terminology as a crutch.
Getting to that level of understanding is not merely a function of the recognized accomplishment level of your peers or your instructor. What imbues people with a deep understanding varies from person to person, and many really benefit from a deliberative, instruction and discussion/conversation-focused instructor. My casual observation of MIT is that the level of instruction can be far from the best I've seen, because the students are so bright or so fast that they don't need much from the instructor in order to immediately see the conceptual space.
Imagine, however, that you are no intellectual slouch, but are perhaps a more contemplative and deliberate thinker. You may be penalized in an environment like this, which has no bearing on whether you can reach the same quality and depth of understanding as your peers. I once had an MIT-educated professor say, "I know how to grade these exams. My 'A' students will finish the exam in the time allotted, my 'B' students will finish most of the exam, the 'C' students less so." This infuriated me then, and infuriates me now. The counterpoint was a (well-known) UMich-educated professor, one of the best I had, who gave challenging exams with no practical time limit. He inspired me to be very ambitious as the lead on a free-form semester group project, so much so that it was clear we wouldn't finish during the semester. He saw how ambitious it was, and gave us an incomplete until we could turn it in the next semester. We did, and it was a really nice body of work.
I've seen similar patterns in the rest of my education. I've always gotten the most not from the best-credentialed professors, but from those who themselves understand deeply and make the material and themselves open to the interested learner.
This is ultimately what teaching and learning should be about, and can easily take place at any good institution, not just the MITs of the world. In my field of engineering, there is no significant correlation between those who were educated at MIT or Stanford, and the impact of their contributions to what is in the field and has truly advanced the state-of-the-art.
My experience at an okay school was that all upper-level classes were graded and taught to scale with the students' abilities. I, and many other students, passed thanks to the curve. Many of us would not have passed comparable (by name and subject matter) courses at the top schools, with their different curves and more in-depth material.
>a gifted student will thrive in the company of other gifted students
This I put a little more stock in. It is competitive to get in, so all of the people in your calculus one class had calculus in high school, and they are all particularly gifted at math. This means they are able to cover more and go deeper into the subject, because no time is "wasted" educating an "average" student.
Even in a class of motivated and gifted learners, there are some learners who are more motivated and/or gifted.
One of the most frustrating experiences for me at Ivy schools was having to slow the discussion down for the less-talented folks. Don't get me wrong, they were still (mostly) smart and knowledgeable, but there was a palpable difference in discussions that accommodated the slower people and those that didn't (e.g., office hours, some seminars, informal round tables, etc.).
In short, and imho, there is still time "wasted" educating an "average" student at these schools, it's just that the "average" is usually quite a bit higher.
You can take it either way. It's hard, but not in a way that makes you better at the subject.
Maybe there's a deeper reason it's good, but I'm skeptical.
>MIT students often complain of being overworked, and they are right. When I look at the schedules of courses my advisees propose at the beginning of each term, I wonder how they can contemplate that much work. My workload was nothing like that when I was an undergraduate.
Ok. How is that at all healthy, not just medically but academically? There are limits, and when you push too far past them, you reduce long-term performance. The human machine operates within engineering tolerances like anything else.
Dirty little secret -- no more than 30% of elite school grads are really "smart" (here meaning they can grasp, apply, and extend complex new ideas quickly). The others are usually bringing in some sort of social capital (athlete, highly skilled and aggressive hoop jumper, etc.).
One thing I try to tell people about elite school grads is to set the expectations early and clearly. The really smart graduates get bored really quickly, and if you have routine work for them, it likely will not work out well. That said, the hoop jumper grads may be exactly what you need as long as you have a long series of hoops. The athletes have their strengths and weaknesses, too (n.b. the elite school varsity athlete totally-not-a-club club is a thing, and that thing has value in the right contexts).
Surely that was clear enough?
That means that when a person is fired, the stated reason for that firing is not always the same as the real reason.
You can argue that with physics as well. However, I have learned in my life that there is value in not having the option to "go back". Flexibility comes at a price.
I hear people say that about mathematicians, physicists and even philosophers, yet I've never come across any of them in my career. I suspect it's wishful thinking.
Take English majors as a more straightforward example. Virtually no English majors are employed doing anything directly related to their studies.
My perception is that it was easier to cross disciplines when I got out of school, more than a couple decades ago. Things like "embedded systems" weren't well enough defined to have an established training-to-job career path, so it was easier to break into those fields. Rejecting candidates who didn't match a template was only beginning to be automated. Today, bigger employers are looking for training that is more specific to the task at hand.
So if you're in a rapidly emerging area where workers tend to be younger, you might see fewer cross-disciplinary people. At my workplace, the people with "generalist" background who are now working in specialized areas of engineering, tend to be over 50, myself included.
There's also a relatively small number of people majoring in math compared to many of the fields the math majors switch into.
My current team of SWE's at BigCorp for a while was >50% phds; I did a math postdoc, we had another MIT math phd, and a physics, bio and EE phd off the top of my head. So I guess it depends what you're working on?
I don't think this is particularly controversial.
I think it's more common among physicists.
Branching over into computer science is common.
As is economics and engineering.
I rarely see a non-math PhD as a faculty in the math department. Same with physics. However, it is not at all unusual to see a math/physics PhD as a faculty member in another department.
I've been looking for a way to express this idea for years. A core subject at university, like mathematics or physics, can be applied in a million different careers.
In a similar way I'd recommend learning say Functional Programming over React or Scala.
Plus I guess learning Category Theory over functional programming too, although I'm not quite sure how true that is.
There is a level of abstraction that, for most people, is the wrong end of the stick to start from, and category theory sits at it. (And I'm a big fan of category theory and Haskell.)
Category theory may occasionally lead to elegant, orthogonal and consistent solutions, especially if you are building new abstract tooling (LINQ in C# is a good example). It gives you tools to reason about structure.
However, in most situations it doesn't give you anything you can directly use. Most of the time it serves to clarify and enlighten rather than generate. Categorical thinking may be more useful than the actual theory.
If you seek to apply category theory without first getting your hands dirty with conventional "inefficient" programming, you will end up with abstruse solutions that are potentially overly complicated and unmaintainable. Applying theory without the requisite real-world experience (to temper the tendency toward heavy abstraction) is a problem of youth. The world is not full of highly-intelligent people who can work with highly abstract code. One needs to learn to work with the inelegance in the real world and with other people in the system. A larger coherent system, though weak in parts, may produce a better result than a few bright people.
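As a toy illustration of "clarify rather than generate": the functor laws let you justify refactorings like map fusion purely from structure, without running anything. A minimal Python sketch (the `fmap_*` names are made up for illustration; this is the flavor of the idea, not a category theory library):

```python
from typing import Callable, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")

# Two "functors": containers with a structure-preserving map.
def fmap_list(f: Callable[[A], B], xs: list) -> list:
    return [f(x) for x in xs]

def fmap_optional(f: Callable[[A], B], x: Optional[A]) -> Optional[B]:
    return None if x is None else f(x)

inc = lambda x: x + 1
dbl = lambda x: x * 2

# Composition law: mapping f then g equals mapping (g . f) once.
# This is what licenses fusing two passes over a list into one.
assert (fmap_list(dbl, fmap_list(inc, [1, 2, 3]))
        == fmap_list(lambda x: dbl(inc(x)), [1, 2, 3]))

# The same reasoning transfers unchanged to Optional:
assert fmap_optional(dbl, fmap_optional(inc, None)) is None
assert fmap_optional(dbl, fmap_optional(inc, 3)) == 8
```

Nothing here generates a solution for you; the laws just tell you which rewrites of existing code are safe.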
The horror! Maybe this was truly a terrible fate in 1997.
> Scientific biographies often fail to give a realistic description of personality, and thereby create a false idea of scientific work.
Any recommendations for biographies which give a realistic look into famous scientists' lives? I am reading the Einstein biography by Walter Isaacson and it's pretty good so far.
Completely off topic: how do you configure a LAN to have <domain>/~username URLs (such as the linked post's URL) exposed to the internet? I remember having such a directory on my school's Linux network, where I could place files in public or public_html (can't remember) and other users could access them by going to <internal school IP>/~myname, but it wasn't exposed to the internet.
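For anyone wondering, those URLs typically come from Apache's mod_userdir, which maps each user's `~/public_html` to `/~username`. A minimal sketch (the directives are real Apache 2.4 ones; paths and the enabling step vary by distribution, e.g. `a2enmod userdir` on Debian/Ubuntu):

```apache
<IfModule mod_userdir.c>
    # Map http://host/~alice/ to /home/alice/public_html/
    UserDir public_html

    <Directory "/home/*/public_html">
        Require all granted
    </Directory>
</IfModule>
```

Exposure to the internet is then a separate matter: the web server has to be reachable on a public address (DNS, port forwarding, firewall rules), which a school LAN would normally block.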
Very in-depth book about a naturalist who deserves to be much better known than he generally is. It also includes several chapters with mini-biographies of famous people who were influenced by Humboldt.
Walter Isaacson also just came out with a biography of da Vinci, and he has one on Benjamin Franklin; though I never finished that one, it was written to the same standard as Einstein.
For me, the hidden curriculum is in machine learning and AI. I was recently given a chance (as an undergrad) to join such a company. I think this is my best chance to learn about the field in a qualitative way and possibly get my name on a research paper before I graduate with a BSc. I'd work under the supervision of Ph.D.'s in the field.
That being said, for my grad school plans I would need to keep my GPA in check; it is currently 3.2 and would suffer a blow. My question is: does anyone have an idea whether admissions boards (at private US colleges) tend to tolerate lower GPAs in exchange for whatever hidden curriculum I've found, given it's still academic and aligned with the degree I'd be applying to? I'd be applying as an international student.
There isn't a single Beethoven scholar I can think of who seriously entertains the idea of Beethoven being a "self-generating" genius who never made mistakes.
Also, the Beethoven pieces that today's composers most admire-- the late string quartets-- were almost universally shunned by actual Romantic period commentators.
Perhaps the Romantic Age stereotype is demoralizing to students because professors with a narrow domain expertise don't actively seek out music history experts to revise their outdated views about the Romantic Age.
As well as anyone who has played a piece by Beethoven or read a children's book about him. The Beethoven stereotype is the impassioned if erratic genius who literally ripped through the paper correcting his own errors in a quest for perfection, persisting in his struggle to write masterworks even after going deaf.
The author is almost certainly confusing Beethoven's myth with the Mozart myth-- the child musical prodigy through whom Christ's perfection sounded.
This matters because the author is attempting to draw a causal connection between Romantic-era genius myth-making and higher education in the U.S. at present. If he can't even match the myth to the right composer, I think I'm right to be skeptical of his theory.
[Prof Rota was a bit of an institution already in the 80s, when I took his probability course 18.313. The course utterly kicked my butt, and I believe him when he writes that the homework would occasionally lead to publications.]
One of the most cutting arguments I've heard from liberal arts education advocates is that STEM extremists want to turn college into a high-level trade school.
Seems that after Gian-Carlo Rota died, the class became defunct. The closest one can get is 18.175, but it still presents the material from a purely theoretical standpoint (e.g. no applications to physics or CS).
While 18.03 is not core, it is a requirement for most engineering disciplines at MIT.
"18.30, differential equation, the largest mathematics course at MIT, with more than 300 students" [ths might be the course that is now 18.03]
"18.313, a course I teach in advanced probability theory"
It's interesting to wonder if, 20 years later, the author would change some of what he originally wrote.
No offense intended; I'm self-taught as well and can appreciate the sentiment immensely. But damn do I wish I'd made a few different decisions during high school. I'm ~1 year into my web development career, and I'm concerned about what happens when I get bored of building APIs and web apps. I seem to have a talent for picking up new skills pretty quickly, but I'm afraid my lack of formal math and algorithms training is going to limit the career trajectory of self-taught programmers like me. And that's without considering the ramifications of an irrelevant college degree.
All I'm trying to say is it's never so black and white, and while I don't plan on using the lack of a degree against somebody, I certainly wouldn't call it a red flag either.
Put in other terms, when you have to play cowboy on a production server, you take a backup first to hedge your bets; otherwise you aren't brave, you're just foolish.
"Over the past decade, the university’s student suicide rate has been 10.2 per 100,000 students, according to a Globe review of public records as well as university and media reports. More recently MIT’s suicide rate has been even higher; over the past five years the campus has reported 12.5 suicides per 100,000 students.
"The increasing rate has been driven by the school’s undergraduate population, whose suicide rate in the past decade has outpaced that of the school’s graduate students — 12.6 to 8.5.
"The national average for college campuses is roughly between 6.5 and 7.5 suicides per 100,000 students, according to three major studies that looked at undergraduate and graduate student suicides from 1980 to 2009."
On the other hand, across the street at that other school,
"At least one other local school, Harvard University, has an above average rate for its undergraduate population. The rate of suicide at Harvard was 11.8 per 100,000 undergraduates over the past decade. When accounting for both undergraduates and graduates, the rate at Harvard was 5.4 per 100,000."
So maybe it's Boston.
"...the university’s undergraduate rate has declined, particularly since the early 2000s. Between 1994 and 2005, the undergraduate suicide rate at MIT was 18.7 per 100,000."
Edit: And I just noticed the "Ten Lessons" were written in 1997.