A successful strategy to get college students thinking critically (arstechnica.com)
74 points by shawndumas on Aug 18, 2015 | 38 comments



Here is a simple definition of critical thinking: be critical of what is being said until it can be validated tangibly. It has only two requirements: the courage to question what is in front of you, and the tools to test and validate statements objectively.

What people trip on is often the "thinking" part. It plays on the myth that thought and logic is what solves problems, but in reality we need to be able to refer to something tangible -- such as evidence. If you don't have enough information, stop thinking and do your research (or experiments). Most evidence is rather obvious once found.

Anecdotally, fear and anti-science are the enemy of critical thought, and those who claim moral authority and depend on spin to achieve their goals have used these weapons time and time again.

"They don't want a population of citizens capable of critical thinking."[0]

-- George Carlin

Amen.

[0]https://www.youtube.com/watch?v=rsL6mKxtOlQ


The most important part of that process, though, is noticing that you don't have enough information. That's a thought process.


The most important process is acquiring evidence. You either have evidence or you don't. It's an observation more than anything.

Though if observing what you know counts as thought, then sure (but I'd still rather reserve the term "thought process" for more difficult problems).


What I was getting at is that a lot of people won't even get as far as making that observation. I think that's the biggest obstacle to critical thinking.

More directly related to your original point, it's quite easy to think that the tangible evidence you have is sufficient, when in fact it doesn't prove as much as you think it does. That's another key place where logic comes in to critical thinking, before the evidence-gathering phase.


> The most important process is acquiring evidence.

Neither evidence nor thought is useful without the other. But people are more likely to be deficient, in a manner correctable by training, at processing sense data to produce useful and correct conclusions (thought) than at merely receiving sense data (acquiring evidence, as distinct from the thought process of determining what it is -- and is not -- evidence of).

> Though if observing what you know counts as thought, then sure

Determining what you can reasonably conclude (and what you cannot reasonably conclude) from evidence is not "observing" anything, it is reasoning -- thought.


Yes, and Yes (hence the "sure").

But what is misleading is that we're often led to believe that what we think or believe is somehow relevant or even important in light of valid objective statements and observations. It's important, but it's separate, and it's truly only important to us (the importance of our own thoughts is personal). Evidence, once internalized, is analyzed through introspection, so internally maintaining the subjective-objective distinction requires discipline and practice. Without that distinction, it turns out we can quite effortlessly reason our way to almost any conclusion given any evidence, and this activity is driven mainly by our intentions and emotions. Herein lies the danger of putting too much unfounded weight on thought. But when we let evidence determine the weight distribution of thoughts, we avoid this problem. It also turns out that good evidence is obvious, making it easy to remember and requiring little thought.

Sherlock Holmes is my favorite example. His breakthroughs come primarily from observations and connecting dots (insight) [0], and not from deep questioning or deductive logic. Though important, those aren't actually that hard or productive or entertaining.

Evidence is the originator, and the more tangible of the two. Our imagination may take us anywhere (Einstein), but without evidence we'd get nowhere (scientific method).

[0] Holmesian deduction consists primarily of observation-based inferences. https://en.wikipedia.org/wiki/Sherlock_Holmes#Holmesian_dedu...


A good rule of thumb for evaluating the merit of groups (political factions, interest groups, whatever) is whether or not they encourage critical thinking (education).


By that measure, "education" includes many of the worst offenders. The most intensely political departments on campus tend to be the ones that talk most about "critical thinking". The loudest proponents of "critical thinking" tend to be the most ideologically monocultural. They tend to be the most adamant supporters of "speech codes" designed to silence their critics. They tend to be the ones who most closely scrutinize others for ever smaller deviations ("micro aggression! he used the wrong word!") from the enforced, orthodox dogma before going back to class to teach their students about "critical thinking".

The Marxist "English" professor who tells you that what she wants most from your paper is "evidence of critical thinking" means she will judge your paper on how persuasively it argues in favor of her political agenda. Real-world evidence that casts doubt on Marxist theory will most likely be punished and labeled "lack of critical thinking".

The social justice warriors in the ed school go on endlessly about their commitment to "critical thinking" and its supposed superiority to "mere facts", yet nobody in electrical engineering ever uses the term, "mere facts". Nobody in the physics department who is trying to make sense of puzzling data from some space probe ever consults the critical thinkers in the ed school to get the benefit of their better-trained reasoning powers.

When Larry Summers, president of Harvard, suggested that a natural, statistical phenomenon (fatter right tail due to greater variance, not higher mean) might be a major reason for more male than female math professors, he lost his job. In attempting to defend himself, he clearly felt that evidence of his political conformity ("I'm a lifelong liberal with a long record of...") mattered more to the decision makers than empirical evidence of the phenomenon. If the top leadership at Harvard, arguably the world's leading educational institution, believes that other top leadership at Harvard is more persuaded by argument from political identity than argument from empirical evidence, what is it they are really promoting when they so vociferously promote "critical thinking"?
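The greater-variance point is easy to sketch numerically. The toy numbers below are invented purely for illustration (they are not Summers's figures): two normal populations with identical means, where one standard deviation is just 10% larger, differ several-fold in how many members land beyond a far-right cutoff.

```python
# Hypothetical illustration: same mean, slightly different variance,
# very different far-right tails. All numbers are made up.
from math import erfc, sqrt

def tail(threshold, sigma):
    """P(X > threshold) for a normal with mean 0 and std dev sigma."""
    return 0.5 * erfc(threshold / (sigma * sqrt(2)))

t = 4.0                    # a far-right cutoff, e.g. "top-tier ability"
p_narrow = tail(t, 1.0)    # population with std dev 1.0
p_wide   = tail(t, 1.1)    # std dev only 10% larger, identical mean

# The wider distribution puts several times as many people past the
# cutoff, despite having exactly the same average.
print(p_wide / p_narrow)
```

The effect grows the further out the cutoff sits, which is why small variance differences matter so much at the extremes.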


I've always thought that STEM instilled critical thinking skills far better than a liberal arts major. The reason is you can't fool mother nature. If your airplane design doesn't fly, no amount of witty rhetoric, mental gymnastics or self-delusion is going to get it off the ground. You have to face reality.


> The Marxist "English" professor who tells you that what she wants most from your paper is "evidence of critical thinking" means she will judge your paper on how persuasively it argues in favor of her political agenda.

Or at least against capitalism, democracy, and/or traditional western civilization.


Do you honestly not understand why it is problematic for the president of Harvard to try to explain away a political issue he is in charge of and has no business trying to explain away?

I am also amused at how salty you are about leftist academics when the very same argument (which isn't very substantial, and mostly tiresome) could be rephrased as an assault on economics, which gleefully assumes ridiculous things about societies in order to prove liberal or right-wing ideas that also happen to be allied to power. But no, economics is completely apolitical, right?

Larry Summers was being completely apolitical and acting appropriately in his role as president of Harvard in making that remark. The reason why he came under fire is clear to me, but it's not clear to you. I don't think I could explain it to you. Your post drips with contempt for a certain class of people you do not name but characterize in the same way other unsavory ideologues do.

> yet nobody in electrical engineering ever uses the term

That's probably because electrical engineering is enormously easier to understand than more complicated disciplines.

> Nobody in the physics department who is trying to make sense of puzzling data from some space probe ever consults the critical thinkers in the ed school to get the benefit of their better-trained reasoning powers.

And yet the theoretical physicists who try to push the boundaries of what we understand to be possible quite often employ thinking that has a deep relationship with the processes that underlie more philosophical disciplines. Not all of life or even science is reading bits streaming in from the outer solar system.

And before my ability to perform quantitative or critical reasoning is impugned: I majored in mathematics and ended up in a discipline that hasn't quite been hammered down the way, say, EE has been. Somehow I survived such an education without becoming so mean-spirited towards people who study far more difficult ideas than spinors and circuit diagrams.

> In attempting to defend himself, he clearly felt that evidence of his political conformity ("I'm a lifelong liberal with a long record of...") mattered more to the decision makers than empirical evidence of the phenomenon. If the top leadership at Harvard, arguably the world's leading educational institution, believes that other top leadership at Harvard is more persuaded by argument from political identity than argument from empirical evidence, what is it they are really promoting when they so vociferously promote "critical thinking"?

Maybe he's not so good at thinking his way out of putting his shoe in his mouth. Authority does not make you good at everything. Neither does prestige in one particular field.


After hearing the kind of bullshit philosophers routinely say about relativity or quantum mechanics, I'm not sold on the idea that philosophy is important to science. Moreover, since we often see the same philosophers talking about hard sciences and about society (and drawing breathtakingly stupid parallels between the two), I expect their ideas about society to be rubbish as well. In fact, since their social ideas are harder to verify definitively, they are more likely to be rubbish. Though I grant you that philosophy is really complicated, in a pointless kind of way.


I'm not even sure what you mean. How can one discipline be more correct than another when they ask different questions?

I doubt that you think that philosophy is pointless. Something I can believe is that you think that the academic and professional institutions of philosophy are pointless. That sounds plausible. To reject philosophy itself? That is to reject the means by which you can even justify the sciences, down to the most quantitative and most successful theories. That is to reject the heritage of human thought that grapples with reality and the mind.


In what state of mind does a person decide that looking at a toaster isn't enough justification for the sciences, and you also need to read philosophers like Sandra "Newton's Principia was a rape manual" Harding?


Science is not the only thing that goes into a toaster. You are ignoring the enormous social institutions that go into modern society and economy. Those social institutions have, in fact, been the subject of intense thought for millennia, and the institutions you have now are rooted in relatively recent philosophical investigations.

And eventually they will be supplanted, and their successors will be rooted in philosophy that is perhaps being done today.

Also, your statement is a terrible piece of rhetoric. Do I have to spell out why it's not a really compelling rhetorical question?


Well, you did ask about justifying the sciences, not the other things that go into a toaster.

As for social institutions, I'm not convinced that the influence of philosophers has been net positive. But that seems like a politically fraught topic. Maybe you could name some uncontroversially good social changes that came from philosophy?


> Well, you did ask about justifying the sciences, not the other things that go into a toaster.

If those other things didn't go into the toaster, science would not make a toaster. Science is useful, but its usefulness is tied up in the rest of human endeavor. Science is a human process. It is good at what it does. Its goodness enriches us because of how it fits into everything else we do.

> Maybe you could name some uncontroversially good social changes that came from philosophy?

I think most people would consider Enlightenment ideals much, much better than what came before them. But I think we're starting to circle around where we are disconnected. You are weaving a narrative of progress, marking human development with milestones where things become better.

I am very much more concerned with the process of coming to and going beyond particular advancements. Philosophy is important because it is a rich store of human experience dealing with human experience, human methods. For example, philosophical methods were and are employed in figuring out what probability is. The history of probability is very rich with ideas. How we come to modern probability is interesting, and it is inseparable from the philosophical investigations of probability.

Really, it doesn't matter whether you or I respect philosophy or deploy it in our lives. Its usefulness, or lack of it, is what it is. All I'm saying is, you're missing out if you're discarding it. At the very least, if you were to become versed in philosophy and still decided it wasn't worth much, you would have a much better grasp of what science is; your facility with scientific methods would be improved at a minimum.


I think there's a bit of bait and switch involved. Natural philosophy (Descartes, Leibniz, Laplace...) was indeed very important to science, and it's no accident that the leading proponents were great scientists themselves. I'm certainly not discarding their philosophical ideas, even the weird ones like monads, because they turned out to be illuminating once and might do so again. On the other hand, modern philosophy is mostly derived from social thinkers (Locke, Hobbes...) who have repeatedly tried and failed to leave any mark on science. That, IMO, wasn't an accident either.

The differences in attitude between the two original groups (rationalism vs empiricism) were very deep and interesting, though maybe a bit outside the scope of this comment. But if you simplify and fast forward to today, you get the modern "science vs philosophy" conflict, in which I have a clear favored side. On one hand, you have folks who believe in the power of the individual reasoning mind, and put a man on the Moon. On the other hand, you have folks who claim that all understanding is social, and cannot explain how a toaster works. (But they sure have a lot of ideas about society!)


There is a flip side to the coin of critical thinking, which is to build those critical thoughts back up into something positive and productive.

It is actually quite easy to teach people to deconstruct the ideas presented to them. But if you are not also taught when it is time to reconstruct a newer, better idea, then you are only half-educated. Without drawing upon critiques to rebuild an improved conclusion, it comes off as simple negativity.

From a business perspective, it is the reason behind the cliche of "Don't bring me a problem, bring me a solution."


> It plays on the myth that thought and logic is what solves problems, but in reality we need to be able to refer to something tangible -- such as evidence.

The fool believes what he thinks; the wise man believes what he sees.


That wise man is going to be fooled a lot; humans are easily tricked about what we think we see, and our memories about what we saw are not very good. Believing what we see might be a bit foolish.


You're sorta missing the point. People are good at believing what they think in the face of overwhelming evidence otherwise - what they see.

And you're right, humans are tricked by what they see because their thinking gets in the way. The goal is to simply see things as they are.

In programming I see it commonly in the statement 'there is no way this commit broke something completely unrelated', yet the build worked prior to the commit and does not work after. Could there have been a coincidental break? Sure, but not often. It does not matter what I think in this case, only that I can see the build worked and now it doesn't.
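That instinct can even be mechanized. Below is a toy sketch (the commit list and the breaking commit are invented) of what `git bisect` automates: binary-search the history and trust only the observable build result, not anyone's intuition about which change "couldn't possibly" be the culprit.

```python
# Toy model of bisecting a broken build. The "build" here is a stand-in
# predicate; in real life it would be running make or the test suite.
commits = ["c1", "c2", "c3", "c4", "c5", "c6", "c7", "c8"]
first_bad = 5                    # pretend c6 silently broke the build

def build_passes(i):
    """Stand-in for checking out commit i and running the real build."""
    return i < first_bad

lo, hi = 0, len(commits) - 1     # lo: known-good commit, hi: known-bad
while hi - lo > 1:
    mid = (lo + hi) // 2
    if build_passes(mid):
        lo = mid                 # evidence: still good here
    else:
        hi = mid                 # evidence: already broken here

print("first bad commit:", commits[hi])  # -> first bad commit: c6
```

The point is that the loop never consults an opinion; each step narrows the suspects using only the pass/fail evidence.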


It can't just be information presented to you. You also need to apply critical thinking to your own assumptions and thought processes. There are tons of cognitive biases you can attribute to a lack of critical thinking applied to oneself:

The status-quo bias, functional fixedness, anchoring, the endowment effect etc.


> It plays on the myth that thought and logic is what solves problems

That's not a myth. Thought and logic is what solves problems.

> but in reality we need to be able to refer to something tangible

Sure, but we need to apply thought properly to something tangible to solve problems. Without thought, observing something tangible doesn't solve any problem.

> If you don't have enough information, stop thinking and do your research

Determining what you need to know to address the problem you have, and determining what you can know from the evidence you have is done through thought. That's how you determine if you have enough evidence or need to do more research, and, in the latter case, what additional research you need to do.


Thinking is what brains do. Brains can be arbitrarily wrong. They are often very wrong. They arrive at being wrong by thinking. They can stubbornly remain wrong through thought. Lots and lots of bad thinking happens. Clearly, something else is essential in order to make thought good.

You can think and logic your way to any position. People who think otherwise are either explicitly or implicitly rejecting some mode of thought or logic. That's fine, but how do they arrive at this rejection, and how do they know if it's a good or bad rejection?

Since they are thinking, and the people who embrace those bad modes of thought and logic are also thinking, clearly it's not just thinking that solves things. If a brain is present then thought is present. That's all that can be said about brains and thought without appealing to more.

Does a brain even need to be present in order for a problem to be solved? Maybe thought isn't necessary at all. But we are part brain, so we have to figure out what good and bad thinking is, so that we can be better brains and not worse.


A friend of mine was attending Xavier, and one of his professors was also an old-school Jesuit priest. My friend told me that this professor started to tell outrageous lies, just so he could stop the class and ask why no one was questioning him. Really, the purpose of that lecture was to set the stage so he could talk about critical thinking and telling truth to power.

There's a Doonesbury about a college-prof character that's something like this. (Except the moral of the story never gets across to the students.)


N.G. Holmes participated in this study as a doctoral candidate; his thesis for partial fulfillment of the PhD program was on this topic. Note the tl;dr: https://circle.ubc.ca/bitstream/handle/2429/51363/ubc_2015_f...


(here's my $0.02 in no particular order)

"Interestingly, the researchers tracked the same students into the sophomore physics course that a third of the freshmen had advanced into. Even there, they still saw improvements, despite the fact that none of the critical thinking instructions were repeated in that course."

I think "Interestingly" should have been "predictably."

A mechanical engineer, a lawyer, a physicist, and an EMT encounter a dilapidated stairwell railing and all have very different thoughts about it. People see everything through the lens of the intelligence they've acquired throughout their lives.

"Critical thinking" just makes a better buzzword than "applying stuff that's been learned in different contexts".

Physics to physics is pretty low-hanging fruit for studying this kind of effect (which isn't a bad thing). Math to physics or economics to history would be a more interesting point to study. For example, the history of North America from 1600-1700 is much more interesting to study if you know about the economics of Europe at the time, which itself is much more interesting if you know that chunk of European history. Knowing physics makes studying basic calculus much more interesting and beneficial, and vice versa.


"Critical thinking" is a term with more specific meaning than just "apply what you know".

Critical thinking means asking yourself how you know what you think you know, mistrusting casual intuition, and examining how your own thinking process could lead you to a wrong conclusion.


It is a low bar, but it's still one that is not consistently hurdled by most education programs. So many students just want to get their A grades and then purge the entire semester from their memories. (Or maybe they don't explicitly want this, but it's what they get by cramming instead of learning.)


My high school physics class was set up like this. Every two or three weeks we had to perform a lab (with real, physical tools) and take measurements, then write up our findings. This meant including full derivations of the physics behind the lab, as well as analysis of the data itself. Most essentially, our teacher (if you're reading this, hi Mr. Schwartz!) graded harshly, and forced us to consider all potential reasons our data might have deviated from the theoretical outcome. Needless to say, we all learned a lot, and everyone I still talk to who took that class in high school has only positive things to say about its effect on their approach to learning. It was strange retaking the course in college without having to perform any of the experiments; if I hadn't covered most of the material already, I'm not sure I would have been able to pass.


My primary complaint with the article is that it makes the implicit connection that "critical thinking" is something tied to science.

Good science needs critical thinking, true. Dealing with life ALSO needs critical thinking, and I suspect that's a more direct impact to the average person. (Witness your most/least favorite political discussion, see any bit of TV news, heck, the recent Amazon brouhaha, and see what people not having enough critical thinking does to life)

College is quite late to try and teach* the skill, but better late than never.

I can think of only one class I ever had that I felt covered critical thinking. It was a junior high school home ec class, and for one single class period, they passed out magazines, had us pick out ads, and then list at least 5 ways those ads manipulated us. Turns out 5 is a significant number: to say wealth, sex, and happiness feature in the ads is easy... but 4 & 5? That requires thought. Thought about how we are being misled, how there is something non-obvious but true. To this day one of the lessons sticks in my mind: "Bayer: 4 out of 5 doctors recommend." But who picks the 5 doctors? It was mind-blowing to young me, and I think it filters into how I consider information I'm getting to this day.

*Teaching critical thinking isn't really teaching, so much as repeatedly providing the opportunity and motivation and hoping for the lights to turn on. It's the best system we've got, and we don't do it enough, but I'll acknowledge that it is harder than teaching a more passive, observable skill.
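The "who picks the 5 doctors?" trick is plain selection bias, and a few lines make it concrete. This is a hypothetical simulation with invented numbers: even if only 40% of doctors honestly recommend a product, an advertiser who keeps polling fresh panels of five will soon find one where 4 of 5 agree.

```python
# Hypothetical selection-bias sketch; the 40% rate is invented.
import random

random.seed(1)
TRUE_RATE = 0.40             # honest share of doctors who recommend

def panel_of_five():
    """Poll 5 random doctors; return how many recommend."""
    return sum(random.random() < TRUE_RATE for _ in range(5))

tries = 1
while panel_of_five() < 4:   # quietly discard panels that look bad
    tries += 1

print(f"'4 out of 5 doctors' found after {tries} panels")
```

The probability of 4-of-5 at a 40% true rate is under 9% per panel, so a dozen or so discarded panels typically suffices, and only the flattering one makes the ad.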


Media literacy is one of the most underappreciated subjects in school these days. I remember only one or two classes ever teaching this kind of critical thinking. Put up against the $50b advertising industry, is it any wonder why young people make such bad economic decisions?


And financial literacy!

I think the bottom line is that you need to take almost everything everyone says with a grain of salt.


I really like this idea, but honestly, isn't it basically a reformulation of the Socratic method? A really good teacher will naturally use dialectical techniques to engage their students. Is this just some way to structure that in a way that can be measured?


I had two years of physics & electronics exercises in my CS university course (they had too many physics teachers and too few CS teachers :) so why not). It was fun, but the thing that stuck with me the most was how rarely the results agreed with theory.

Usually the results were off by orders of magnitude (and we had access to the laboratory for 45 minutes each week, so once you'd done your experiments you had to stick to the data you'd got). Usually it turned out at home that the data made no sense.

I don't think it was useful for my job, but it certainly made me appreciate practical physics more.


When I did my lower division physics coursework our instructor relentlessly browbeat us about our data, and whether or not it was good. This was to the point that he taught us a separate data and error analysis course at the same time.


That description of what the teachers did differently has to be one of the most vague descriptions I have ever read. Can anybody do a decent job of articulating what they actually did that worked so well?



