Hacker News
An alarm about the influence of standardized tests on American society (scientificamerican.com)
143 points by champagnepapi on June 2, 2017 | 191 comments



Standardized tests are a lousy way of deciding things. But at least they're better than non-standardized tests.

At my university (SFU, in Canada) admission to most faculties is based on high school grades. That's grades awarded by high schools -- we used to have province-wide grade 12 exams, but for almost all subjects those were abolished a few years ago. It's widely recognized at the post-secondary level that a 90% grade from some schools means much more than a 90% grade from some other schools; but if anyone is actually adjusting high school grades to account for these differences, they're not admitting it.

Entrance scholarships, and admission to one faculty (Business), are determined partly based on high school grades and partly based on extracurricular activities -- in most cases, volunteer activities. Great, we're bringing in well-rounded students who are dedicated to helping the community, right? No; it turns out that they admit that they're only volunteering in order to pad their resumes -- in some cases because high-priced consultants hired by their parents told them exactly what volunteering they should do and what to say about it when they're writing their scholarship applications. (I've been lobbying for years to get an analysis done of how the rating of student extracurricular activities correlates with the median income of their postal code. I'm still waiting.)

The SAT became popular because it could identify smart kids who would otherwise have been overlooked due to their race, gender, religion, or class. Is it perfect? Hell no. But as we discard standardized tests, I worry that we're moving back to a world where university admission will depend less on aptitude and more on having parents who have the money and the connections to get you the "right" extracurricular experience.


Yeah, I don't think Sternberg is advocating for ending standardized testing:

"My colleagues and I developed assessments for creativity, common sense and wisdom... we doubled prediction [accuracy] for academic performance, and with Kaleidoscope we could predict the quality of extracurricular performance, which the SAT doesn’t do."

It sounds like he just wants to reform them, and has some strong initial results to suggest that we could do much better.


Any metric can be gamed. It only works until people figure out how.


No one has figured out how to game well designed IQ tests. That is the appeal of high g-loaded tests like the SAT.


The trick is to change your metrics often enough, so people can never be sure they gamed the right one.


"You mean I spent a hundred hours volunteering at the soup kitchen for nothing‽"


Perfect!


See also: once a metric becomes known, it ceases to be reliable.


>I worry that we're moving back to a world where university admission will depend less on aptitude and more on having parents who have the money and the connections to get you the "right" extracurricular experience.

In my opinion, we already moved back to that system a long time ago.

There's also a second round of this phenomenon with unpaid internships.


I knew a classmate who openly admitted to putting made up volunteer activities on her college applications, arguing there was no chance the school would fact check her. She was right!


The problem with these is that they couldn't have been significant enough to make a dent in the outcome. Pretty much all significant achievements are fact-checkable. E.g. you may falsely claim that you were doing karate or basketball, but if you didn't win any trophies, then most selective schools wouldn't care about you.


You might be surprised. I was on the committee which adjudicated major entrance scholarships for six years, and I can count on my fingers the number of times someone said "can we get some verification of this?" We've never thought that fraud was a common problem, and when you're competing to recruit the top students you really don't want to go back to them and accuse them of dishonesty.

And (although we're revising the system and hopefully changing this part) the relative weighting of "significant achievements" of an easily fact-checked nature vs. "doing lots of stuff" has skewed heavily towards the latter; you could have a resume full of "volunteered at X" and "leader of student club Y" which would be worth as much as the difference between a 90% high school average and a 95% high school average, while containing nothing which would show up online or elicit commentary from a referee beyond "this student is heavily involved".

(We're putting together a new system starting next year which is aimed to be more about identifying students who are in some way exceptional as opposed to merely being generally all-round good people. A large part of the motivation for this is to avoid easily "gamed" metrics; nobody spends 15 years playing violin or becomes a national chess master simply because it will look good on their resume. So with luck we'll end up having less of a fact-checking problem in the future; but we're still going to have a significant amount of trust built into the system.)


A few years ago, I was mentoring at the local (major!) university and met a student who was the President of a club. Then a week later, met another student who was the President of the same club. A few weeks later, the same.

It turns out, a bunch of students got together, created a club, and each member was called a "President." The best part is that if an employer ever checked, any member could honestly say "yes, he's a President in that club."


I would hire all of them.


And make each one CEO.


I never really understood why admissions committees give any credit whatsoever to 'well-rounded' students. If somebody spent all their time volunteering in 10 different inconsequential clubs, playing 5 musical instruments, and participating in 3 different sports, I would consider them a 'tourist' or a dabbler, if you will, and wouldn't give them any credit. Now if they happened to win a championship, get a gold medal, or qualify for the Olympics, etc., then that's an entirely different story.


You're pretty much preaching to the choir here; but the position I hear most often is that when we're selecting the students we want to have as part of the university community, we should pick students who will contribute something to said community (beyond "mere" academics).

In some cases, I see students' extracurricular activities as demonstrating an ability to dedicate themselves to a cause or pursuit; that sort of perseverance is important in higher education. But in most cases that sort of dedication goes along with extraordinary success of the sorts you mention. (My personal scholarship assessment rubric actually includes a specific value for "competed in the Olympics or equivalent level of international competition".)


What's wrong with doing a bunch of stuff instead of being the best? These kids are in high school, maybe they just wanna try out a bunch of clubs and see what interests them in college and beyond.


There is nothing wrong with that, but why should trying out a bunch of stuff make someone more eligible for a scholarship?


Why should being good at violin or a sport give you a non-musical scholarship?


Why shouldn't it?


Doing a bunch of stuff in high school is a pretty big indicator that you are from an affluent school and/or an affluent background. The rest of us had after school jobs or chores. Never mind the parents that can arrange those unpaid internships. As to after school clubs, we didn't have any because we didn't have the money to pay teachers to stay around.

If a school actually cares about the diversity of its students, then judging on these activities self-selects for a much narrower socioeconomic background.


I'm more impressed with the kid who came from a broken home in a terrible school district who managed to graduate and do well than I am with the kid who had every advantage but was also involved with the student politics and choir (or whatever).


Indeed. Out of six years of reading scholarship applications, the most impressive extracurricular activity I've read about was working at McDonalds -- because it was clear that they had been given nothing in life for free, and had spent years working every available hour trying to put together money to cover the cost of their education.

That demonstrates commitment in a way that volunteering for worthy causes never will.


I'm super glad to hear someone in higher education say this. Is it a common opinion in admissions circles these days?

(By the by, I'm a huge fan of your work. It inspired me to get into compression several years ago and it's been a passion project ever since.)


The notion of "take students' background into account when evaluating their extracurricular experience" is very widespread. There are some students whose applications discuss volunteering in far-flung parts of the world, and that gets boiled down to "ok, so this kid comes from a rich family..." and doesn't tend to yield positive results.

But the precise details of how this happens varies from person to person on the adjudication committee at my university, and I'd assume other institutions are similar. Some people opt for demographic indicators (e.g. "this is an indigenous student, and indigenous students tend to have less access to extracurricular activities"); I don't like that approach since I insist on evaluating individuals as individuals rather than as members of collectives. On the other hand, I'm probably unique in how much I look for a narrative and self-awareness; to me, what a student has done is less important than their ability to articulate why they did it (and so "working at McDonalds" by itself didn't count for much, but "working at McDonalds because ... [story about why higher education is important to the student and why they're willing to make sacrifices for it]" was crucial).


I know this is off topic, but I find it really interesting to hear that you put a lot of weight or rely entirely on the narrative, which I assume would mostly boil down to the application essay (unless you also interview candidates in person). I've always thought that a significant number of Ivy League applicants hire admissions consultants and marketing experts to hone their "message" and to help them leverage their background and extracurriculars to build a likeable persona that would result in an acceptance outcome, and I believe the essay would be the primary vehicle to achieve that. Do you have a reliable way to detect when a student is really articulate and when a marketing expert has been hired to write or significantly direct the essay?


" nobody spends 15 years playing violin or becomes a national chess master simply because it will look good on their resume."

I don't understand why doing extracurriculars in order to get into a good university is seen as "gaming". If anything, it is quite responsible behavior.


It's a waste of human capital to learn something you know you won't ever enjoy just to get into university.

A system based on such bullshit is immoral.


Due to the volume of applicants and the rise in competitiveness, don't you end up being faced with a glut of exceptional candidates anyway? In which case the selection would still inevitably amount to a draw based on arbitrary criteria or simply the way committee members felt that day.


> but if anyone is actually adjusting high school grades to account for these differences, they're not admitting it.

I am pretty sure this was the standard line for campus tours at the University of Waterloo -- that you should take high school exams at your school and not at some diploma mill, because Waterloo did adjust your entrance average based on what high school you came from, and how those students did once they got to university.

(That said, I also heard that there are so many "perfect" candidates that they're contemplating rejecting students at random in order to hit class size numbers, because scaling up requires more lecture halls and more dorms, and those things take years to build out.)


Sorry, I should have elaborated. My understanding is that Toronto, Waterloo, and a few other top universities in Ontario have had a "black book" of this nature for a long time. I'm not aware of any universities in BC admitting to doing the same thing.


> that you should take high school exams at your school and not at some diploma mill

I have never heard this term applied to a high school before. Is this a thing in Canada?


It's a thing in the US too, especially in high-end athletic circles with traveling teams, etc (the flip side of the argument is that elite athletic prospects can't fit traditional high school schedules into their routines easily).


> At my university (SFU, in Canada) admission to most faculties is based on high school grades

Do you not interview the applicants though? I think the best thing to do is to set a basic requirement for grades from their high school, then get people in and ask them to take your own tests and give them an interview with one of the teaching staff to make your own judgement about whether they're suitable or not.

Accepting someone to be your student for four years without ever having met them is as crazy as any of the other problems mentioned here.


We're a public university admitting over 7,000 students per year. There simply aren't the resources to interview students... not to mention the fact that SFU competes with other institutions and would likely lose a large number of students if we required them to come in for interviews.


If you don't have the resources to interview them, how can you have the resources to teach them?


As we all know, people never judge based on looks or accent or verbal word choice. We select for talent even if it's from someone with a different culture and social milieu to us, right?


Well there's no need to be so sarcastic and snarky.

I did say some universities administer their own tests, so that's as objective as a test normally can be (not perfect), and probably more objective than one administered across different schools as people have already said.

I don't think it would be unreasonable to combine that with an interview. Teaching is a personal thing - academics will want to check that they can teach you and you're able to learn from them. The best university interviews I did were like a short teaching session and I came away with an opinion about who I wanted to learn from as much as I presume they had an opinion about who they wanted to spend their time teaching.


> Teaching is a personal thing

May I ask where you had that experience? Throughout my undergraduate degree, most classes had hundreds, some more than a thousand students. There was barely any opportunity for individual interactions.


One potential benefit of using high school grades is that it spreads out the opportunity to attend college, over a wider geographic area than just the affluent cities. In my state (Wisconsin), admission to the flagship state college is distributed in some similar way, though I don't know the details.

Now of course we can ask if it's the best way to distribute this resource, and I think it depends to some extent on what you think is the mission of the college. At least in Wisconsin, there has been a traditional idea that the college exists to serve the development of the entire state. (The Wisconsin Idea). Politically, it gives people in the far corners of the state a stake in supporting public higher ed.

Of course that's just one take on the mission of one school.


Texas is similar.

If you're in the top 10% of your graduating class, you have a spot in the University of Texas system. Though UT-Austin (the flagship) uses the top 8%.

Ref: http://catalog.utexas.edu/archive/2015-16/general-informatio...


Seriously there is no national standard for high school tests in Canada?


Correct. Not even provincial standards in many cases.


Quebec uses something called the R score to rank applicants to universities. The R score of a student is the sum of the student's Z score (how many standard deviations above the mean), plus the group's Z score relative to all students.

https://en.wikipedia.org/wiki/R_score
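
A rough sketch of that calculation in Python, going only by the simplified description above (the official CEGEP formula also adds a group-dispersion correction and scaling constants; see the Wikipedia article):

    import statistics

    def r_score_simplified(student_grade, class_grades, cohort_grades):
        # Student's Z score within their own class.
        class_mean = statistics.mean(class_grades)
        z_student = (student_grade - class_mean) / statistics.stdev(class_grades)

        # The class's strength, expressed as the Z score of its mean
        # relative to the full cohort of students.
        cohort_mean = statistics.mean(cohort_grades)
        z_group = (class_mean - cohort_mean) / statistics.stdev(cohort_grades)

        # Simplified R score per the description above: the two Z scores summed
        # (the real formula rescales the result).
        return z_student + z_group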


My spouse is a classroom educator.

Her school participates in Great Expectations program. http://www.greatexpectations.org/

It is a character development program "that provides teachers and administrators with the skills needed to create harmony and excitement within the school atmosphere"

There are 8 Expectations at the elementary level:
1. We will value one another as unique, special individuals.
2. We will not laugh at or make fun of a person's mistakes nor use sarcasm or putdowns.
3. We will use good manners, saying "please," "thank you" and "excuse me," and allow others to go first.
...and so forth. http://www.greatexpectations.org/expectations

In my opinion, when the U.S. abandoned the Judeo-Christian ethic in the late twentieth century, character and wisdom training was denigrated as "old fashioned" and unnecessary.


>2. We will not laugh at or make fun of a person's mistakes nor use sarcasm or putdowns.

I would have appreciated this as my public school experience was rife with negativity, sarcasm, snark, jeering, insulting, etc.

I'm not sure what causes such anti-social behavior but I dream of a day where all students can be kind, considerate, polite, and supportive of one another.


Having been brought up in a positive environment that was nevertheless full of sarcasm, I'm not sure I buy this argument.

Yes, there must be a balance. But sarcasm and negativity are part of our everyday life as adults and we should learn how to handle them.


The best way to handle sarcasm and negativity is, in my opinion, to reduce contact and/or interaction. To not reward such behavior and instead choose to associate with positive people.

The problem is that school-kids are forced into repeated, unsupervised interactions where they are often deliberately and persistently targeted by aggressors.

Out of curiosity, what purpose does sarcasm serve you? What do you get out of sarcastic interactions?


>Out of curiosity, what purpose does sarcasm serve you?

"Ha ha, only serious". Gentle mockery can be a non-threatening way of expressing grievances and telling people difficult truths. A joke can be a very useful way of letting someone know that they're annoying us, that their work isn't up to par or that their haircut is unflattering. It's why the British excel at it - we're not very good at blunt truths, so we tend to make a joke of things.

A classic example might be the British way of greeting someone who is late - a gently sarcastic "nice of you to join us". It gets the message across without being a direct admonishment. Most of us would be unwilling to directly criticise a colleague for slacking, but we'd find it far easier to sarcastically remark "you must be rushed off your feet".

Nobody wants to be surrounded by relentlessly negative people, but uncritical cheerleaders can be just as harmful. Sometimes we need to be told things that we don't want to hear, lest we turn into vainglorious prima donnas, drifting through life with a total obliviousness to our obvious shortcomings. Sarcasm, irony and gentle mockery can make that bitter pill a little easier to swallow.


Or you could directly tell them they are annoying or that their work is not sufficient, instead of being passive-aggressive and hiding behind jokes. A bonus is that even dudes and dudettes with Asperger's will know what the hostility is all about.

Sarcasm turns a factual debate about performance into a personal attack - and people have every right to respond in kind.


It's communication. If you communicate what you want and not what you don't, it's successful by definition.

The point made above is that sarcasm can be a way of communicating what you want (eg, "you're late") without what you don't want ("you should be ashamed/feel bad/apologize/etc"). This unwanted implicit communication is common in blunt statements of fact and is part of why that communication style is often described using words like blunt or harsh.

In this sense, sarcasm and similar serve the opposite role to the one you describe: a way of jokingly or obliquely raising criticism without demanding a direct response. Of course, those criticisms can be personal-- but that's a property of criticism, not of its style of delivery.


It stops people's egos from getting too big, and keeps them grounded. Verbal play, attack and defence are also an education for more subtle interactions later in life. Like it or not, adult life is a sea of competition and words are weapons.


In some domains this is absolutely true. If you're lucky, you have options to explore outside of those domains.


If you handle other people’s sarcasm, cynicism, sardonicism, nihilism, ... by eliminating interaction with them, you’ll miss out on hanging out with some truly hilarious and insightful people throughout your life. YMMV.


My school wasn't rife with it. Maybe only half my teachers used sarcasm, mockery and negativity towards students.


Half of teachers being outright hostile to students is far too many, in my opinion.

How does this behavior positively impact the lives of their students? Do they not understand the psychological damage they can be inflicting? Do they simply not care?


Adolescence is an exercise in accumulating scar tissue. What you call damage, I can argue is growing up. You need a thicker hide and a more independent source of self-worth; reliable, continual external validation inhibits that, imo.

Teachers are just other people too. The sooner teenagers recognise it fully, the better placed they'll be to choose their own way rather than have it chosen for them.


If it's done in a respectful, consensual manner then sure... there's nothing wrong with friendly harmless teasing. The problem is that it often crosses a line into bullying with zero net benefit for the victims.

Saying something like "What you call damage, I can argue is growing up" is a dangerous phrase; the type of dismissiveness that gives me the impression that you may have difficulty with respecting the boundaries of others. Could you please provide a couple of examples of the type of conversations/comments you've made that helped people build a thicker hide? Have you ever been thanked for it?

That said, adolescence is absolutely not an exercise in accumulating scar tissue so much as learning through trial and error how to interact with others, gain an education, and learn life skills to become a functioning member of society.


They don't recognize it.


I'm going to guess merpnderp was indulging in a little sarcasm of their own.


In my early teens, I was the only one who used sarcasm in the classroom, and none of my peers understood it. There needs to be more sarcasm in schools.


I've always felt overuse of sarcasm to be indication of a juvenile mind that thinks it's more clever than it actually is. Occasional use is fine.


Why does there need to be more sarcasm in schools?

I'm willing to hear you out, but would appreciate it if you could clearly explain the benefits of increased sarcasm in schools.


Sarcasm is a great rhetorical tool.


The cause is competition.


In many ways the trend is for the US over the last 200 years to become less ethical the more formally religious it is. (As measured by things like church attendance.)

If you can cite some counter-evidence I would be interested in reading it.


That's not all that surprising: outward religious formality is a sign of religion being tribal identity rather than substantive ideology.

There's a degree to which some tribalism is useful for promoting continuity of community and preservation of ideas, but there's also a persistent danger of degeneration into empty tribalism preoccupied with outward form.


Sounds like you have an axe to grind, especially since you cite no evidence yourself... How would you measure "less ethical," anyway? Are we more ethical now than 200 years ago or less, in your view?

According to Gallup, 91% of Americans in 1948 claimed to be Christian, compared to 69% in 2016. [0] I would assume "formally religious" tracks similarly. Personally, I think we are less ethical, but since I wasn't around in 1948, hard to say.

I think ethics is related to your value of what is right and wrong. In 1948 people by and large had a "Christian" outlook on what is right and wrong and why. In 2016, the more popular claim is moral relativism, namely that there is no absolute right or wrong. Given that I can make up right and wrong and they only apply to me and not you, that seems to be a recipe for unethical behavior. In fact, I'm not sure "ethics" is a meaningful word if you subscribe to moral relativism.

Say what you like about "formally religious," but I think it offers a much better framework for ethics than the moral relativism we have now.

[0] http://www.gallup.com/poll/1690/religion.aspx


In what way was USA 1948 more ethical than USA 2016?

When I think of USA 1948, I think of things like Jim Crow, women requiring their husband's permission to open a bank account, interracial marriage being forbidden, homosexuals staying in the closet for fear of their lives, young men being forced into military service against their will, and lobotomy as standard psychiatric treatment.

But, you know, there's occasional swearing on TV now, so maybe it balances out.


USA murder rate in 1948: 5.9, vs 4.9 in 2015 (i.e. now). I don't know about you, but I would call murder a rather extreme form of unethical behavior. https://en.wikipedia.org/wiki/List_of_countries_by_intention...

Now, compare with, say, France, a very secular and minimally religious country, whose murder rate is 1.31. (Though not 100% apples to apples, this extends across many other similar statistics.) http://www.nationmaster.com/country-info/compare/France/Unit...

Surprisingly, China and Japan, perhaps the most atheist countries out there and with significantly different economic situations, also have lower murder rates than France. (Though again, not apples-to-apples statistics.)

PS: I often hear about morality-religion links but only see a few meaningless connections, like church attendance and divorce rates.


> According to Gallup, 91% of Americans in 1948 claimed to be Christian, compared to 69% in 2016. [0] I would assume "formally religious" tracks similarly.

This is kinda tangential, but according to a very interesting panel discussion a few weeks ago, "formally religious" (as measured by church attendance) has actually been more or less constant over this period. The 22% drop includes few who regularly attended religious services. The relation to the panel discussion, and somewhat to this one, is that the drop in nominal adherents has been part of what has fuelled the moralist resurgence on the right since the 90s.


> In 2016, the more popular claim is moral relativism, namely that there is no absolute right or wrong.

No, it's not.

Conservatives love to pretend that disagreement with their values is rejection of all values, but that is simply not the case.


I've never heard conservatives say that rejecting their values is a rejection of all values. I don't think that a conservative would say that liberals have no values; liberals have very clear values. I can't speak for the ones you've interacted with, but I've never heard that.

Regarding moral relativism, the argument, as I understand it, is that post-modernism rejects any meta-narrative; you decide for yourself what the narrative is. Moral absolutism requires a meta-narrative of some form. This is right because God told us it is, or because this is the core value our nation is founded on, etc. So without a meta-narrative, you have to define your own, and so you are left with moral relativism.

It's possible that post-modernism is no longer widely held. I'd probably be the last to know about it. If that's the case, then maybe there is moral absolutism. Certainly there seems to be an idea in some circles that protecting the environment, and/or that everyone has a right to express their sexuality however they want, are absolutes. But a truly moral absolutism provides an absolute basis, and I'm not aware of any absolutes for these. Environmentalism has a pragmatic basis: if we don't do it, we might die off, but perhaps dying off is actually best. (I don't agree, but philosophically speaking.) And what basis is there for everyone having a right to do things? There are things we decided we don't have the right to do (kill people, for example). Why, exactly, does everyone have the right to express their sexuality however they want? Moral absolutism requires some fundamental, unalterable reason. Moral relativism simply requires "I think this way."

I'm not very sure what most people's world views are, but everything I am aware of points to a moral relativistic view, rather than a moral absolutist view.


> I've never heard conservatives say that rejecting their values is a rejection of all values.

Neither have I, but I've frequently heard them characterize groups that explicitly adhered to different values from theirs as rejecting all values.

> Regarding moral relativism, the argument, as I understand it, is that post-modernism rejects any meta-narrative; you decide for yourself what the narrative is.

This might distantly approach relevance (leaving aside questions of its accuracy) if the left, either in the general sense or in the peculiar American sense that includes much of the center-right, was generally post-modernist. But that is not, and has never been, the case.

> Moral absolutism requires a meta-narrative of some form.

No, it doesn't. In fact, because you can't actually logically derive an ought from an is, a meta-narrative doesn't even add support to moral absolutism (or any other moral position.)

Any morality requires taking certain moral beliefs as unsupported axioms, and absolutism just requires that those axioms don't include the idea that the morality of an act is dependent on the actor's view of the morality of the act.

I can assure you that liberals regularly believe that conservatism is wrong independent of conservatives' belief in its rectitude, which absolutely is moral absolutism.


> Personally, I think we are less ethical

What does this mean to you?


When I wrote it I was thinking of the lack of statesmen. I can think of a number of politicians from 1800-1960 who were Statesmen. I'm thinking of someone who is wise and who is willing to make sacrifices to bring about a better future for people he may not even know. I can't think of any major politician since then that really stands out as a Statesman.


Can you give a couple specific examples?


I'd love to understand what you base that last sentence on.


While standardized tests are criticized for rewarding general intelligence at the expense of other strengths, it seems that college admissions has been steadily devaluing general intelligence over the years.

Correspondingly, the alphabet tests increasingly reward rote memorization, diligence, and repeated practice at the expense of general intelligence.

When I was in college the most disturbing trend was students who expected all tests to be things that they could study for using the technique of rote memorization and diligence. To a large extent, pre-law and pre-med curricula offer this, and so many of the students who excel using that technique end up in positions of social authority.


When I was in college the most disturbing trend was students who expected all tests to be things that they could study for using the technique of rote memorization and diligence.

I remember being accosted by a squad of students, who were there to make sure I was going to be a good little TA and run my course according to those "rules."


I did some tutoring in a graduate program. There were definitely some students who were, for reasons of academic background or otherwise, barely able to hang on. And they were very focused on exactly what was going to be on tests and what they really needed to know to squeak through.


> When I was in college the most disturbing trend was students who expected all tests to be things that they could study for using the technique of rote memorization and diligence. To a large extent, pre-law and pre-med curricula offer this

I can't speak to pre-med, but IME pre-law curriculum tends (unsurprisingly, like law school itself) to include courses for which rote memorization and diligence are necessary but not sufficient for exam success; specifically, where there is a mass of detailed material (rules and their context, mostly) which must be memorized fairly exactly, but which must be applied to novel circumstances in free response essays in exams.


Yeah, people with law degrees are probably overrepresented in politics, but pre-med?


Healthcare is one of the biggest areas of government, and administrators in the field often have medical and/or public health degrees on top of pre-med or substantially similar undergraduate coursework.

For that matter, I think that medical doctors are substantially overrepresented in elected general government office (state and federal legislatures, etc.) even though not nearly so much as lawyers. (e.g, Congress is 2.8% physicians, while the US adult population is around 0.4% physicians.)


Ben Carson is heading HUD after all, welcome to 2017.


The article mixes two ideas that partially contradict each other.

1. Tests like the SAT, ACT, the GRE [...] You end up with people who are good at taking tests and fiddling with phones and computers, and those are good skills but they are not tantamount to the skills we need to make the world a better place.

2. Do we know how to cultivate wisdom? Yes we do.

I agree with the second statement, and its implications are huge. If we give people a good education and good role models, then people are going to be wiser.

I don't agree with the first statement. There are still a lot of prejudices against software developers and other high-skilled professionals that are part of America's culture. It correlates high IQ with low creativity. Can't high-IQ people learn wisdom? Isn't it part of the second thesis that it can be taught?

The best example I have against the first thesis is how developers behave at the companies that I have worked for. These are Swedish companies that value collaboration, personal life balance, and other social skills. People at these companies are social, technically brilliant, they participate in team-building activities, we go for beers together, there is more gender parity (but it is still very low), etc.

If a company values and rewards these values day to day, people are social animals and behave as such. If a company asks developers to work weekends and offers no vacation time, are you surprised to find only angry, non-creative, non-social people?


Why do you think American developers are non-creative, non-social people? Both conference/hackathon/meetup culture and startups who sell themselves to potential employees on cool cultures suggest that American developers are way more social than they like to admit. Non-social people tend to produce a culture that is less fanatical about the various ways of socializing. Add to that companies that really follow agile processes, with extremely close teamwork and occasional pair programming, and you get an industry where a shy or introverted person is at a pretty high disadvantage.

I mean, so many here measure developer worth by participation in hackathons and talks at meetups. An introverted or loner engineer is at a serious disadvantage there.


> conference/hackathon/meetup culture and startups who sell themselves to potential employees on cool cultures suggest that American developers are way more social than they like to admit

That's my point. They are social, but they can't be like that in companies that value long hours and weekend work.

It is not the developers that are not social, but that "old fashioned" companies force them into a more restricted, non-creative behavior. In a traditional waterfall model, developers are supposed to be implementers, not thinkers. In that kind of company, everything is hierarchical and creativity is not expected. So their developers behave that way. And I would guess that that is the model of developer the article is talking about.

The article, for sure, is not talking about the developers that you describe. And that's my problem with the assumption in the article that people that are good at a technical problem are not social. :)


It is startups that require the most hours per week and proof of out-of-work passion, not just old-fashioned companies. Maybe even not mostly old-fashioned companies. I think that stereotype of the non-creative, non-social developer is largely overblown - fed more by movies and gossip than actual reality. I have seen some statistic about developers being more introverted than the general population, but that is about the extent of the difference. Introverted does not mean non-social.

I don't really know what you mean by the traditional waterfall model and creativity - waterfall was more a matter of process organization. Whether you get to be creative or not depends mostly on the business the company works in and the personalities involved. You may not get to be creative if you implement insurance calculations, and you can be creative when you are styling a cheesy art shop.


If we give people a good education and good role models, then people are going to be wiser.

I think the structure, culture, and emergent properties of web forums have shaped our culture in some insidious, highly negative ways. I think that the totalitarian and highly censorious nature of many web forums has conditioned entire generations to subconsciously expect the world to have the same totalitarian and highly censorious nature.

I see multiple instances of younger people getting "organized" online, only to engage in mass-groupthink stretches of logic and common sense in order to manipulate some authority to do what they want under threat of their emotional toxicity, or "yellow journalism"-style association of that authority with something unsavory. Never mind that what they want contradicts established law, and that their efforts would best be aimed at changing those laws. Many of the same groups of people seem to expect the authorities to have the same absolute and arbitrary power to change laws, regulations, or even the laws of physics in response to their emotional toxicity. (This applies on both sides of the left-right political divide!)


Idea 1 simply states that wisdom and creativity may actually be more important than the intelligence or knowledge being tested for. It doesn't insinuate that we cannot cultivate wisdom, nor does it say that the two cannot coexist. Merely that we're only filtering based on those tests, without also considering wisdom.

I do not see the contradiction.


Is this culture of working weekends and having no vacation time really the norm though?

I mean it seems to be in Silicon Valley and where startups are concerned, sure. But the average American likely doesn't work in that sort of company, with the kinds they work at (retailers, large corporations, etc) trending more towards a normal working week with more reasonable vacation options.


As a high school student I find the hating on the SAT/ACT a little concerning, since once you remove standardized testing, admissions is primarily based on GPA (which imo has massive flaws, the most important being the lack of consistency across schools). Also, this article doesn't sound all that "scientific" to me.


In addition, admissions based on GPA means that students are motivated to choose easy classes and lenient teachers whenever they have a choice. IMO, the system should not punish students for choosing hard classes - if anything, it should reward that.


Nope. Some classes are weighted higher to prevent this.


I don't remember my high school report card having a difficulty rating on the class. I would be very interested in data on the number of high schools that do this weighting.


When I was in high school in the Dallas area, AP classes were all weighted one-step-up-the-scale. A's were 5, B's were 4, etc.
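
For illustration, a quick sketch of that weighting in Python (hypothetical course list; schools differ in which courses get the bump and by how much):

    GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

    def weighted_gpa(courses):
        # courses: list of (letter_grade, is_ap) pairs.
        # AP courses are bumped one full step, so an AP "A" counts as 5.0.
        points = [GRADE_POINTS[grade] + (1.0 if is_ap else 0.0)
                  for grade, is_ap in courses]
        return sum(points) / len(points)

    # e.g. two AP A's and one regular B: (5.0 + 5.0 + 3.0) / 3 = 4.33
    weighted_gpa([("A", True), ("A", True), ("B", False)])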

I forget the max GPA in our class, it was somewhere between 4 and 5, but I don't think it was the theoretical max you could get if you took as many AP classes as possible and got A's in all of them.

Colleges weren't stupid about it, either, though. My brother had a lower GPA since he went to a smaller school with very few AP classes available, but did better relative to that curriculum than I did relative to mine, and he got more scholarship offers and quicker admission offers at the schools we both applied to. Even though my SAT was 10 points higher.


We had pretty strict biology teacher and pretty lenient physics teacher - both classes equivalent of AP. So I would pick physics if GPA mattered to me a lot.


AP classes graded at 5.0


Ok, we didn't (and they still don't) have AP classes at my old high school. How many school districts have AP classes that go to 5.0?


I agree with you, but if I were an admissions officer I would also like to see the SAT/ACT be two different numbers: one being your score when you took it as a freshman and a second from late in your junior year so I can see the delta. If you're a great test taker and both scores are equally high I can see that, but I can also see the case where someone with a very low initial score might have improved it greatly to an average score and I can infer that they might not have the best IQ, but that they are at least working hard at improving, which is also a valuable signal from an admissions perspective. Maybe we should standardize when the tests are given and that there be multiple scores recorded over time.


Or maybe they have a high IQ, but were unmotivated to study for or perform during the test the first time. Maybe they just had a bad studying strategy the first time and wised up the second time.

Low IQ would mean that studying won't help you - leading to consistently low results. High IQ means you are capable of good results. The rest is down to a lot of factors that are not IQ.


What makes you think basing admissions on SAT/ACT does not also have 'major flaws'?


I imagine because being nationally standardized, the SAT/ACT more reliably correlate to IQ. A 4.0 GPA means nothing but an A in all classes, which may give some insight into Conscientiousness, but it says nothing about what the classes covered, what the school averages at, and is thus harder to correlate with IQ.

Since IQ+Conscientiousness is better than either alone, of course they'll want to consider both.


IQ has "major flaws" too.


It's a game.. basically you have to prep for the test.


Honestly, it seems to me that many who bash them are just insecure because they didn't do well and therefore claim that they aren't a valid measure.


There are studies that seem legitimate that have found that a certain amount of prep can boost your score by multiple tens of points.

This is problematic if you want it to be a valid measure because it lets people artificially close the gap.

Do you want the student with such a well-rounded education throughout their life that they got a >=99.9% score without needing to study, or the one who caught up by grinding for a few weeks without that same broad underlying fundamental base? If your test can't tell that apart, it's flawed, because access to the right prep is going to become a significant part of what you're measuring, and that access is (for obvious economic reasons) distributed along unequal lines, reinforcing the already-fortunate.

My suspicion, since I doubt we can come up with a test that can't be prepped for, is that the best answer is either (a) nobody preps or (b) everyone gets access to a lot of prep. The former seems impossible to enforce, plus is the sort of "individual liberty"-constraining thing that the politically dominant swathes of America hate. They want to be able to make their kids the best, they "earned it." The latter would be easier, but isn't without some complications, and of course is time that could be better spent on something other than maximizing a single metric.


Well it stands to reason that if a test isn't a valid measure then those who are hurt most by it will be the most vocal complainers. That doesn't invalidate what the complainers have to say. There are numerous studies showing the poor validity of these standardized tests.


Depends what you mean by validity. They have been shown to strongly correlate with IQ.


There is an extraordinarily dangerous subtext to this argument and it isn't very well hidden.

>Wisdom is about using your abilities and knowledge not just for your own selfish ends and for people like you. It’s about using them to help achieve a common good by balancing your own interests with other people’s and with high-order interests through the infusion of positive ethical values.

The author is essentially arguing that "intelligence" should be redefined to include not only practical problem-solving abilities, but a specific set of moral values. He is arguing that it is acceptable to reject an otherwise excellent college candidate because they're a nihilist or an objectivist or a nationalist, so by his standards are "unwise" or "lacking in ethical intelligence".

Right now, America is in the midst of a vast rebellion against the scientific consensus. Millions of Americans reject essential and well-proven scientific theories, in large part because they believe that academia is a highly politicised liberal institution. They don't believe the data and they don't trust the objectivity of the people who collected it. I very much doubt that making college admission contingent on holding a certain set of ethical beliefs will help to challenge that view.

Conservatives are vastly under-represented in academia and have been for many decades. Many conservative students and academics feel that they are actively discriminated against on an institutional level. Whatever your political beliefs, this should be deeply concerning to you, if only because of the ammunition that it hands to the likes of Trump.

http://www.hup.harvard.edu/catalog.php?isbn=9780674059092


I'm with you. Defining wisdom as out-group altruism makes it a sort of zealous, unconsidered enforcement of the status quo.

I think there probably is a case to be made that semantic intelligence at scale harms a society, but this sciam interview doesn't present a good argument.


There's an assumption in this article that virtue - using your intelligence to benefit humanity rather than self or tribe - is a character trait, either one that's inborn but should be selected for or one that can be imbued by education and then is relatively unchanging. There's pretty ample evidence [1][2][3][4] that this assumption is false, and that people are highly situational in their moral reasoning.

In particular, expectations of abundance and reciprocity seem to factor particularly highly in peoples' moral calculus. Most people are quite willing to undertake pro-social, common-good actions when a.) they will not be personally hurt by it and b.) they have a reasonable expectation that the beneficiaries of their largess will turn around and either pay it back or pay it forward. When those conditions are not met, people have a tendency to act like locusts [5], and try and secure their personal resources before considering what that might do to sustainability or the larger world around them. Doing so is evolutionarily adaptive, as it ensures that they'll be the ones to survive and pass on their genes even if the rest of humanity dies.

[1] https://en.wikipedia.org/wiki/Milgram_experiment

[2] https://en.wikipedia.org/wiki/Eichmann_in_Jerusalem

[3] https://en.wikipedia.org/wiki/Fundamental_attribution_error

[4] https://en.wikipedia.org/wiki/Trolley_problem

[5] https://www.ribbonfarm.com/2013/04/03/the-locust-economy/


The article/interview claims that we can cultivate and test for virtue (creativity and wisdom), and that our current tests used to funnel intelligence into higher education and intelligence work aren't currently accounting for it. I'm not sure where in the article the claim is made that it cannot be improved upon.


"Improved upon" assumes something that is constant on the level of long-term worldview, not something that may change situationally. You wouldn't usually say that your choice of what to have for dinner or which sweater to buy can be "improved upon", but many of the individual choices we make that lead to the world we currently live in are on that level.

I guess the article does mention engaging the ethical reasoning centers of our brain and training them to realize that it's an ethical problem at all, but my argument here is that there's a reason most citizens in developed countries don't think about the moral implications of, say, buying an iPhone made in a sweatshop in China that uses rare-earth minerals that pollute the environment, and that's because the number of choices we face on a daily basis and the moral complexity of all the consequences of these would paralyze us into inaction if we did - in which case, those people who don't bother to think about it would dominate consumption.


I see what you're saying about the virtues being "different based on the situation and/or other variables." However, that doesn't mean you can't "improve upon" the ability to make virtuous choices.

In your dinner example, if someone has only math and finance education, they might simply say that fast food is the best choice for dinner. But if they get some nutrition education, suddenly they're wondering if maybe something healthier might be worth the financial trade off. So you can absolutely improve decision making with education. That example didn't include virtues, though it could include making decisions for others (i.e. your family) that improve their lives. Of course, wisdom about nutrition is a more direct example of something you might not teach or test for, but in this case, it is useful.

Sure, there are situations where the choices are just between two fast foods, and neither choice will seem wise, but again, if the individual is taught only math rather than creativity, they might not realize that there's also a fresh produce market (i.e. a third choice!) hiding around the corner.


Sure, and this type of moral education is actually done in schools. I grew up in the 80s and 90s, and we had a unit on water conservation; a unit on energy efficiency; a play we wrote about rainforest destruction, endangered species, and global warming; recurrent reminders to recycle; programs for anti-drug and anti-smoking initiatives; sex ed; and pervasive lessons on respecting each other and authority.

I haven't had any connection to the school system since 2000, but from what I've heard, kids these days also get lessons on nutrition and on diversity & inclusion.

But here are three major complexities:

a.) A good portion of America believes that this sort of moral education is indoctrination. You're seeing the rise of Christian homeschooling as a reaction to it, for example, which proponents of these values find just as morally abhorrent.

b.) There's an opportunity cost to everything you teach. So while diversity & inclusion has gotten a lot more classroom time of late, the importance of free speech seems to have fallen by the wayside. While we hear a lot about global warming or fossil-fuel exhaustion, there's little about groundwater pollution, rare-earth mining, or labor rights.

c.) Once people get into the real world, they discover markets. Markets reward people who supply what others do not, which means that if you have a cohort of idealistic young conformists, the one defector can make lots of money simply by breaking the consensus. In many ways, this is exactly what moralists complain of, and what led off this article: people who break common ideas of virtue are getting very rich off of it. And yet we don't want to get rid of markets, because they supply the things that we want but somebody else would rather not supply us with.


That's a good analysis, particularly the part about defectors.

And under that analysis, it seems that wisdom education, even of the deeper style that Buddhists or Christians practice, will likely fail, because a single defector amplified by tech could do so much damage.

So are there any solutions to that?


The trick is to teach people to express their values by consumption, not by production or investment. Most people have the intuitive sense that they are a moral person when they personally don't perform any acts they consider immoral. This sense fails when they can pay someone to perform an action that they would consider immoral but with a consequence they desire, all without their knowledge.

The one place, in a capitalist market economy, where values have a place is in the endpoint, at the consumer. Consumer markets really depend only on values; by definition, they are those goods that people buy because they value them inherently, not those that they buy to achieve another goal. The rest of the economy self-adapts to produce those goods as efficiently as possible, which often means production practices that people would find morally repugnant.

If people trained themselves to a.) seek out as much information as possible about the supply chains of the companies they buy from and b.) not buy from them if the externalities introduced by the company aren't in accord with their values, then the market would self-adapt to reflect those preferences. It effectively creates a market for intelligent entrepreneurs to do the right thing, by influencing consumption desires so that there's a profit potential in being good.

There are still some very significant challenges in bringing this information to the consumer, and in people integrating all that information into their choices. A lot of the benefit of market economies is that they condense a lot of information from each stage of the production process into one number, a price, which propagates throughout the value chain so that self-interested actors end up producing a good in the most maximally efficient way. If you want to encode moral judgments and pass them through the value chain, you need a lot more information to go from producer to consumer. And businesses have a lot of incentive to cheat and conceal this information, so accurately recording & transmitting it is challenging.

Actually, I just had a crazy half-baked idea around using vectors instead of scalars for prices and encoding this information in a cryptocurrency, but the margin of this HN comment is too small to contain it. Maybe someone else can run with it.


It changes situationally, but not everyone acts identically in the same situation, so it must vary personally too, no?


I would assume so, yes.

Additionally, the same action might be perceived in a different moral light by different people. Just look at how the same events are reported in Breitbart vs. CNN vs. Pravda vs. Al-Jazeera (and read the comments on them) for an example of this.

The complexity of moral education is that you have to balance situation vs. individual vs. audience, and it's very difficult to do this in a consistent way. Propose any concrete curriculum, and it's almost guaranteed that some people will scream about how you're imposing your values on the rest of society, and at least one kid will take the exact opposite lesson from what you intended and do something you found morally abhorrent.


I dropped in on a physics class at my uni and I was surprised at how poor the students' understanding of physics was. These are super sharp people and they struggled with simple problems. It's because they learn in this testing way. They don't actually understand the material. That's the most direct example of this problem that I've seen. In order to fully understand things like you ought to, you have to take the initiative yourself on top of studying for tests that are often made arbitrarily hard with memorization and other things of that nature. Most of the college kids I know don't really know anything. And just like this article said, they are only good at gaming the system and advancing their own self-interest. There is no big picture. They never wonder how they might be changing society and they never wonder how they might want to shape society. Even my 40-year-old roommates don't. It's a death sentence for this country. An absolutely lethal development. I think the recent political developments are only the beginning of things to come due to this problem of apathy.


> I think it’s hurting everything. We get scientists who are very good forward incrementers—they are good at doing the next step but they are not the people who change the field. They are not redirectors or reinitiators, who start a field over. And those are the people we need.

I doubt the problem is with the scientists. I know and work with lots of really smart and creative people in an academic environment, and I'm sure some of them would be capable of redirecting, reinitiating, of starting a field over -- were they given a chance. But just try to get funded for anything at all that isn't incremental and see how well you do. Scientific funding is so scarce, and becoming more and more so, that funding agencies only approve grants that are practically guaranteed successes, and those tend to be the incremental ones.

Maybe narrow testing is responsible for the plague of incremental science, but if so it's probably not the scientists' fault, but more likely because we've cultivated a society that isn't creative and thoughtful enough to appreciate the value of really novel science.


Most of the need in science is in advancing another step. For every major advancement we need tens of thousands of small advancements to refine and work out all the details. Physics looked 99% solved in 1880, with just the work of pinning down the constants remaining. In that missing 1% were relativity and quantum mechanics, and we know there is something more still, but not what. However, as big as those two are, they are still less than 5% of the physics we need. (The numbers don't add up because now that we know about relativity we know it was more than 1%, but it isn't a significant factor in most real-world needs, which still get by on Newtonian physics.)

Yes, there are areas of complete unknown in science (all over). However, they are complete unknowns: you can spend a lifetime in false starts.

In computer science, the P=NP problem has been an important unknown for years; many have spent a large part of their lifetimes on it, and we are still not close to a solution. And that is a known unknown; much of what remains is not even that well defined, just cases where "something funny" happens, we have no idea why, and we often can't figure out how to replicate it on demand.


The ancient Western word for this is virtue: to cultivate virtues as part of an education. It's not a novel concept, but it rather got shot in the head in the postmodern revolution in education, and we're a lesser society for it.


Do you have links to old educational material where this sort of thing was part and parcel of the pedagogy? I am interested in getting into homeschooling, especially focusing on character development. I suppose an emphasis on the humanities would be a good first step, but I don't know where to find the "leading lights" in this kind of old-school education.


Not the OP, and it's a rather old book, relatively speaking, but you might also give Jean-Jacques Rousseau's Emile, or On Education a chance.

It's got nothing pedagogically scientific in it, at least by today's standards, but were I to have a kid I'd base my attempts to teach him about things in life on what this book has taught me (it also talks about lots of other interesting things, not only pedagogy). Reading it, I realized that my dad was at certain points following this book almost to the letter (I've never asked him if he had read it, though), and on the book's Wikipedia page there's mention of the Montessori method being some sort of follow-up to what Rousseau was thinking in terms of educating children.


I would look into experiential education/learning.

Dewey is considered the modern father of experiential education. The main text is Experience and Education (1938). Kolb (and many other folks) have written on this topic more recently.

When done well, this is an amazing way to learn.


Well, here's a/the start to "old school":

http://classics.mit.edu/Plato/meno.html


The keyword is "Classical" education when you're looking for homeschooling materials.


I'm not familiar with the idea that there was a 'postmodern revolution' in education (or even with what that would look like). What changes are you referring to?


The essence of the shift in the postmodern worldview is the belief that truth and virtues are subjective; that communication is fundamentally impossible due to the differences between people; and that teaching and the knowledge of the old are predicated on a mode of oppression. This has largely been laughed off by STEM people, because it is hilariously wrong. Unfortunately, it got into the liberal arts and humanities and poisoned their ability to make virtue judgments and compelling moral cases.

Historically in the US, the complaints are more from the conservative POV, but I'd not let that deter you. Bloom's Closing of the American Mind is probably one of the more famous ones. The Sokal Hoax was a famous shot fired in the matter.

There's more than a small amount of "gerroff my lawn" in the critiques, but the substance is more than just moaning about "kids these days", it's a deep philosophical worldview difference that has ramifications across society.


The subjectivity of truth[1] and morality is relativism; focus on oppression and power is an element of critical theory.

Both are regularly conflated with postmodernism, but neither is intrinsically postmodernist - relativism predates it by approximately forever, and (contemporary) critical theory is a child of the postmodernism of the 1980s and 1990s (with origins in the late 1930s).

There are good reasons to be skeptical of both, but it's a gross exaggeration to say that they've "poisoned" the humanities. My experience as a philosophy major has been full of extreme (and deserved) skepticism towards relativism. Critical theory has grown in popularity, but it's nowhere near the boogeyman that many (including my peers in STEM!) seem to make of it.

Individuals vary, of course, but "postmodernism" never took over the liberal arts. It's just another school of thought, and not a particularly popular one at that.

Edit:

[1]: I don't want to give the impression that most (or even any) relativists believe that all truths are relative. This would, again, be a tremendous mischaracterization - most are not interested in challenging empirical truths.

Instead, most relativists are skeptical about the absoluteness of moral truths. I happen to be a (somewhat firm) moral realist, so that's all very silly to me. However, it's the difference between a completely incoherent view (all truths are relative) and a coherent view that I think is probably incorrect (moral truths are not absolute).


See "The Myth of the Postmodern University".[1]

What the above article is missing, however, is that "postmodernism" in English departments, Cultural Studies, Women's Studies, etc., has long been a boogeyman among right-wingers like David Horowitz,[2] who don't like criticism or questioning of the oppressive social systems they benefit from and take the position that where such criticism is dominant, conservative and right-wing views are marginalized. That's really what's at the core of complaints against "postmodernism" in universities.

[1] - http://leiterreports.typepad.com/blog/2008/10/the-myth-of-th...

[2] - https://en.wikipedia.org/wiki/David_Horowitz


Oppressive social systems? You mean like the capitalist patriarchy boogeyman?

I'm all for questioning social systems because goodness knows the ones we have contain many problems.

But the article you linked is from 2004 and out of date with regards to what is happening. Postmodernism in universities is moving into the social sciences and attempting to destroy the bedrock of modern Western civilization.

It wouldn't be so bad if proponents had something better to replace it with - but all they have is a labored version of Marxism that increases racism and segregation, and lowers meritocracy through identity politics. Social equity is the goal, but increased oppression will be the result.


"Oppressive social systems? You mean like the capitalist patriarchy boogeyman?"

I mean things like racism, misogyny, colonialism, social inequality, and yes, even the oppressive aspects of capitalism -- which actually do exist, much as some would like to spin capitalism as an untarnished blessing on the world.

"But the article you linked is from 2004 and out of date with regards to what is happening. Postmodernism in universities is moving into the social sciences and attempting to destroy the bedrock of modern Western civilization."

What is the bedrock of Western civilization, pray tell? And how does what's happening in the social sciences threaten it?

"It wouldn't be so bad if proponents had something better to replace it with - but all they have is a labored version of Marxism that increases racism and segregation, and lowers meritocracy through identity politics. Social equity is the goal, but increased oppression will be the result."

This has been the line of so-called conservatism with a smile, which pretends to care for the downtrodden, the oppressed, and minorities. That line might fool many today who have little familiarity with history, but it is a real stretch for those who've seen the conservative boot stomp down again and again on those it is now pretending to concern itself with. No, conservatism has always been concerned primarily with protecting and advancing the interests of the rich and powerful, and there's no indication that it has suddenly turned over a new leaf and come to love its historical victims.


I don't accept that racism and misogyny are still embedded in social systems. Western colonialism doesn't exist anymore, unless you count overseas military bases. And capitalism is undoubtedly tarnished, but few claim otherwise. Capitalism has lifted millions out of poverty and increased absolute wealth significantly.

The largest stomping boots belonged to people like Stalin and Mao, who were far from conservative and ostensibly for equity. History tells us whose interests they cared for.

Many people with power will attempt to retain and increase their power by exploiting others, regardless of their political ideology or the social system they're part of - it's human nature. Their corruptive success may be mitigated by suitable social organization, but my contention is that social Marxism will actually increase their success.

A primary goal of conservatives is preserving what works, at the expense of attempting alternatives. I'm not broadly conservative, but on this issue I think it's worth preserving the values that allowed the West to prosper. Judeo-Christian values like virtue and work ethic. Enlightenment values like individualism, free speech and personal property. Modern values like mass education in science and reason.

Postmodernism deconstructs and equalizes everything. It has no respect for what works in the real world or how difficult it is to evolve relatively stable and fair societies.

If you think the corruption in our current system makes it untenable, how do you propose stopping corruption from once again infesting a Marxist replacement? Would it not be better to patch what we already have?


Let's agree to agree that such criticism is within the intellectual purview of the university.

However, I posit that the essence of postmodernism is to deny objective claims to truth and virtue. Would you agree with that?


Education as job training rather than education for intrinsic value.


Historically speaking, I suspect there's a correlation between "virtue-focused" schooling and bodies of predominantly-upper-class students who are already nearly guaranteed to get an entry-level job in a prestigious field. The study is designed to help them compete inside a more-rarefied social circle.

Meanwhile, other students were in apprenticeships or going somewhere that focused on the three R's of readin', ritin', and 'rithmetic.

If the amount of virtue-schooling is decreasing per-capita over time, that may just be regression to a centuries-long mean.


> Historically speaking, I suspect there's a correlation between "virtue-focused" schooling and bodies of predominantly-upper-class students who are already nearly guaranteed to get an entry-level job in a prestigious field. The study is designed to help them compete inside a more-rarefied social circle.

Absolutely. The "liberal" in "liberal education" means "free from having to earn a living" -- education for the independently rich, the ruling class, to ensure that they can rule wisely, justly, and prudently. Those who don't have property yet have always been better off with a technical education -- a category which historically even included law and medicine.


Makes sense considering the U.S. is a society where you have no value unless you work. The presiding political party doesn't think you should have food and housing unless you have a job.


> you have no value unless you work

This subjunctive clause stands alone as a true statement. How else does one earn money?


> This subjunctive clause stands alone as a true statement. How else does one earn money?

Well, one can gain money from passive rents on capital without work.


You're equating value with money. I wasn't.


> You're equating value with money.

Well, unless we regress from currency to bartering, value is measured in money.


I think the parent comment is overstating the case, but the insinuation of critical race theory into some administrators' thinking is very troubling.


For a short book-length explanation of this, see "The Abolition of Man" by C. S. Lewis.


"Wisdom is about using your abilities and knowledge not just for your own selfish ends and for people like you. It’s about using them to help achieve a common good by balancing your own interests with other people’s and with high-order interests through the infusion of positive ethical values."

That's a really odd definition of "wisdom". That's more like "unselfishness" or maybe "benevolence".


I think it's a tolerably accurate definition; it's almost always more wise to cultivate enlightened self-interest instead of the unenlightened kind. Another generation of the current state of affairs, and the US will probably have another civil war -- and it'd be the height of folly to let things escalate to that point.


I see many parallels with the post and my own experience. I got my PhD in math and taught many college math courses during that time. I was always saddened at the number of students that essentially viewed math as just an exercise in the memorization of algorithms (memorize differentiation and integration rules and chug away at problems). I would avoid teaching this way and instead give them fewer, "deeper" problems that "exercised the concepts".

Students (the "good" and "bad" ones) would usually go through the same transition throughout the semester: they would first be very uncomfortable and generally dislike it, then after a few weeks they would start to get used to the process of thinking things through, and finally they would tend to like it much more than what they used to do (of course some students would absolutely despise the process even at the end). Some of the students that liked the process the most were the "bad" ones who were historically worse at the algorithmic style (and unsurprisingly some of the ones who were very good at that style took the longest to convince).

Personally I think it's very important to focus on such a style of teaching. If students apply Calculus at work, they'll usually be using some form of abstract reasoning (say finding a reasonable model for some phenomenon) or they will be applying numerical methods (or both), but rarely will specific memorized differentiation rules be necessary. Ironically I believe that the memorization of the steps is most useful to those going into theoretical mathematics where of course being able to do as many "basic" things without thinking will always help you focus on the "true problem".

The main issue with this approach is that it is hard for students to know if they understand something correctly (which I would argue may be the single most important intellectual skill to learn in life). They can't just look up an answer and see if they got it right, instead they basically have to ask themselves if their reasoning makes sense. The difficulty in this causes a huge amount of student angst and pushback making it more uncomfortable for the students and more work for the teachers. I think the main reason why we teach math as mainly rote learning is because it's the easiest way to teach. It's my job to go through the motions and your job to learn the algorithms as best you can. From the teaching perspective, this is relatively easy to teach, grade and defend.

As a final example, I think that the focus in Calculus on the Fundamental Theorem of Calculus is misplaced. It _is_ fundamental from a theoretical perspective, but essentially no numerical integration is done that way; instead, the actual way integrals are computed is some application of Riemann's methods. (The counterexample being languages like Mathematica, which are specifically designed to solve things symbolically.) I found it a very good exercise to write Riemann integrals in Python (which students could quickly understand as pseudocode), use different methods (left-endpoint rule, midpoint rule, etc.) to compute the integral in different ways, and verify that the convergence matches the predicted errors. I think this is the sort of reasoning that should be focused on much more, rather than having students find anti-derivatives all day and simply repeating "remember plus C".
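For anyone who wants to try something along those lines, here is a minimal sketch of what such a script might look like (the helper names and the sin(x) integrand are my own choices for illustration, not the original assignment):

    # A minimal sketch of the kind of exercise described above: approximate the
    # integral of f on [a, b] with left-endpoint and midpoint Riemann sums,
    # then check how fast each converges to the known answer.
    import math

    def left_riemann(f, a, b, n):
        # Left-endpoint rule with n equal subintervals.
        h = (b - a) / n
        return h * sum(f(a + i * h) for i in range(n))

    def midpoint_riemann(f, a, b, n):
        # Midpoint rule with n equal subintervals.
        h = (b - a) / n
        return h * sum(f(a + (i + 0.5) * h) for i in range(n))

    exact = 1 - math.cos(1.0)  # integral of sin(x) on [0, 1]
    for n in (10, 100, 1000):
        left_err = abs(left_riemann(math.sin, 0.0, 1.0, n) - exact)
        mid_err = abs(midpoint_riemann(math.sin, 0.0, 1.0, n) - exact)
        # The left-endpoint error should shrink roughly like 1/n and the
        # midpoint error roughly like 1/n**2.
        print(f"n={n:5d}  left={left_err:.2e}  midpoint={mid_err:.2e}")

Doubling n should cut the left-endpoint error roughly in half and the midpoint error roughly by a factor of four, which is exactly the kind of convergence check the exercise is after.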

This is a bit of a long-winded and rambling post, but I do miss that aspect of teaching. Pushing students to realize their ability to reason about math was always very fulfilling. Not everyone enjoyed it, but the majority would come out realizing that they too can reason about math given enough patience and time.


Very good points. I think fundamental to this is the reliance on the weird mixed theory/application version of calculus that is taught in our institutions: a great deal of it is just developing a bag of tricks to help you find analytic solutions for integrals of elementary functions. The reliance on calculus at all is, I think, because of this fact: calculus provides a huge number of problems which are easy to design and can be made easy or difficult to solve depending on the arbitrary tricks they require, whereas a theoretical approach would require fundamental proof-based analysis questions which are far more difficult to both design and solve. And a numerical approach, while extremely informative, useful, and natural, isn't suitable as the "mathematics IQ test" weed-out course that many people seem to want calculus to be.

Indeed the math subject GRE was (when I took it) primarily many, many tricky calculus questions with a very strict time limit, because these problems are fairly straightforward to write and standard.


Awesome comment! I wish you were my math teacher back in the day. I was one of those "bad" ones and I've recently had to go back and (re)learn a lot so that I could get a better grasp of CS fundamentals and become a better software developer.


Thanks! It's definitely a long road. Unfortunately you'll realistically never be able to learn the basics as well as people who fused them into their heads over the course of 12-15 years of education, but what you can do is "learn how to learn". If you acquire the patience and confidence, you will be able to understand better what you really need to understand. It's a long road and a lot of work, but if you actively work towards that goal, you'll definitely improve more than you ever thought you could. Good luck!


There are definitely two distinct intelligences: one that relies on knowledge and memorization, and one that relies on reason. Much of math is taught the knowledge way, similar to history or biology, and a certain type of person excels at this. However, a different set excels at the reasoning approach. I don't think one approach can teach all people, however.


It's definitely true that there is no one size fits all. It's unfortunate that we can't truly tailor learning individually in a class setting (well maybe technology can successfully change this, but I still don't think it's the case). When teaching one on one it is much easier to quickly hone in on a specific student's difficulty, but rarely are you afforded the opportunity.

Also I don't want anyone to think that I believe that rote learning is bad per se. Memorizing basic steps is extremely important. That is what allows you to focus on the important aspects of a problem as opposed to the details. For example, you would be a fool to try to learn German grammar before you get a good enough base of vocabulary. In fact, I think one of the best first steps when learning something new is really memorizing the basics well.

However, I come from the perspective that rote learning has been overdone. Most people come through their math education and never leave that learning style. That is what I find unfortunate and wish were different.


You cannot be the type that relies on reason alone. You need a large set of facts to reason from.

I don't need 2+2=4 - I could reason it out from first principles (where did I get them from?). However, when I'm solving a larger problem it helps to know that 2+2=4 without having to figure it out or put it in a calculator.

This is why arguments for either one alone fail to hold weight with me. You cannot have just one: you have to have facts to reason from, and you have to have the reasoning ability to use those facts.

Now I will grant that you can have facts alone. They won't do you any good, but you can have them. We can argue about which facts in particular are worth knowing (do you need to know cursive?); since there is only so much time in a lifetime to learn facts, eventually you need to choose not to learn some of them. It means you cannot reason from those facts, but some facts are not as useful as others. 2+2=4 is a universal fact that should be early on the list of facts to know (bushmen would disagree - it isn't useful to their world). Where the Battle of Bunker Hill was fought is useless in Australia, but part of our shared cultural heritage in the US.


So the 30 point IQ thing, is that a reference to a global increase driven by access to education, or am I missing something? Because it's not like 130 is the new normal IQ score? But at the same time haven't they always been normalized around 100 in industrialized countries?



> What I argue is that intelligence that’s not modulated and moderated by creativity, common sense and wisdom is not such a positive thing to have

I think he left out the most important category -- empathy/compassion/caring/"emotional intelligence".


"Excellent sheep" is same idea by another former Yale Prof, William Deresiewicz

http://www.billderesiewicz.com/books/excellent-sheep


This article was odd to me. It seemed to be advocating a MAGA world view, that kids used to be taught "values" in school.

Is this article a reflection of serious scholarship, or partisan pandering?


...IQ tests and college entrance exams like the SAT and ACT are essentially selecting and rewarding “smart fools”...

A lot of coding pathologies and programming language pathologies seem to be the product of such "smart fools." (Mea culpa!) There are many things which seem like good, elegant ideas, which don't work out well in practice.

Come to think of it, a big chunk of, "The devil is in the details," can be translated to, "The hardest problems are the emergent problems."


> There are many things which seem like good, elegant ideas, which don't work out well in practice.

Do you have any good examples of this in programming?

I'm starting to see a lot of the great parts of languages like Haskell, Erlang, Clojure, and other FP languages starting to flow upwards into more popular languages or in new more accessible forms (Elixir, Swift, Kotlin, Rust).

It's possible that many of the good, elegant ideas aren't exactly the wrong solution but are stuck in less accessible, fringe, or merely unpopular languages/platforms. Which might be more about the nature of how languages are adopted rather than the individual attributes and qualities of the language.


Java (1.0) had checked exceptions: if you threw something, you had to say what it was in your interface. It seemed like a good, elegant idea: if I use your API, I know exactly what exceptions I need to watch for. It works well in trivial programs, but in anything larger it fails, because adding a new exception to something core requires thousands of modifications to the API specification for an exception that I will catch in only a few places.

I think Java has made some improvements on this since 1995, but I haven't done Java since then, so I don't know what they are.


Huh? Checked exceptions are great tools for APIs and are still in use - especially in large systems. They fail within small modules, because they are less practical in that context.


> Do you have any good examples of this in programming?

The CORBA "sea of objects" concepts infesting Linux distributions and making everything sinfully slow. (Sometime just after Red Hat 7.) My own Smalltalk meta-level "browser helper" wrappers in the "new" ObjectStudio Smalltalk browser. (For making the browser extensible, great, but didn't really have to be instantiated until actually operating on the class.) The Smalltalk-inspired implementation of "everything is an object" by making everything a pointer to a different struct. (Not good for cache coherence!)

> I'm starting to see a lot of the great parts of languages like Haskell, Erlang, Clojure, and other FP languages starting to flow upwards into more popular languages or in new more accessible forms (Elixir, Swift, Kotlin, Rust).

So it goes. Same thing happened with many OOP concepts -- both good and bad. I'm not saying that everything that rises is scum. The cream also rises. I did not say that all things which seem like good, elegant ideas are actually programming pathologies in disguise.

> It's possible that many of the good, elegant ideas aren't exactly the wrong solution but are stuck in less accessible, fringe, or merely unpopular languages/platforms.

And some of the good, elegant ideas turn out to be bad ones when you start to scale in different ways.


It puts me in mind of the three minute degree[0]

> Education can produce what the Village wants.

> "A row of cabbages," Number 6 objects.

> "Indeed," Number 2 agrees. "But knowledgeable cabbages

[0]http://daviddeley.com/profdeley/thegeneral.htm


We need reliable ways to measure student achievement, knowledge and skill level. If we're not satisfied with the cost and negative externalities of standardized testing, then IMO we should be seeking ways to make the assessments that teachers are using in their lessons more accurate and trustworthy. We're already paying people to do the work, but we just don't trust the results. That's the problem - so let's fix it.


Yeah, isn't this exactly what Sternberg is saying? It sounds like he's done a lot of work precisely on improving the assessments.


Sternberg is arguing that what the tests measure isn't desirable (but presumably the implementation of the tests would be the same/similar to current tests?).

I suppose my suggestion is a little off topic as I'm addressing some criticisms of standardized testing with respect to cost/relevance to desired learning outcomes/inability for use in guiding a student's learning or teacher's pedagogy/taking time away from primary learning for test prep/and so on.

We currently spend a little less than ~$2 billion a year on standardized testing, but the problems inherent in the form/format of these tests aren't changing. I believe we could get better results for much less if we turned our attention to creating a way for teachers to implement trustworthy assessments in their own classrooms and lessons. Why pay third parties so much money to do something that's already being done (but which needs to be improved to make it trustworthy)?


Probably a step up from people who are simply fools?


Misleading title. Thought this was about Cucumber.


^ Sarcasm.


[flagged]


In fact, current elite US college admissions seem to select for people who vote Democrat well enough. It would be hard to increase that percentage very much. Just look at the political affiliation reported by e.g. Harvard students. If anything, I would think that an institution which tested for wisdom might shift conservative a percentage point or two. [Not because I think conservatives are necessarily more moral in the US, but because I think more conservatives are religious, and churches teach the language of virtue, which probably at least better tells you what the 'right' answer on wisdom tests is, whether you live that answer or not.]


Liberals and academics have tended to pretend values or virtue don't really exist while following the principles themselves. They tend to participate heavily in their communities, get married, have kids later, and not commit much crime. However, they absolutely refuse to acknowledge that these behaviors are virtues or that anyone else should behave that way. The only way they really enforce their concept of virtue is by forcing people to accept certain things about sex and gender.


> liberals and academics have tended to pretend values or virtue don't really exist

No, we liberals (academic or not) just disagree with conservatives (who tend to share values, again, whether academic or not) on some details of what good values are (and, equivalently, what constitutes virtue).

Disagreeing with your values is not the same as pretending values don't exist.

> However, they absolutely refuse to acknowledge that these behaviors are virtues

I know of no liberals who would disagree with the idea that community involvement and not committing crime (provided just definitions of crime) are virtues.

Marriage is a different story, but then even many strands of socially conservative Christianity (such as traditional Catholic doctrine) don't view marriage as a general virtue, but rather as a virtue specifically for those with a vocation for it, a vocation which is no more inherently virtuous than that for either religious life, priesthood, or committed (lay) single life. It's true that liberals often see additional lifestyle choices as no less virtuous than these.


Hi, academic liberal here.

I don't think I've ever pretended virtue doesn't exist. And, to be frank, most of the liberal types I know, regardless of their (a)religion, believe there are virtues.


Look at the hundreds of billions of dollars the DOE has spent since its inception under President Carter and ask: why have scores and graduation rates not shown real improvement? Why are we still asking the same questions we did back then?

If there were one federal agency in dire need of dissolution, it is that one. Give the money to the states in grants and see which ones come up with better solutions.


What if your kid lives in one that comes up with the worst solution?


Agreed this is an important question. I think the second question is "is the worst solution significantly worse than what we have now?". I don't know the area well enough to take a stance here, but it seems the parent is claiming there is little to lose and lots to gain. So do you agree with that, particularly the little to lose part?


My pushback was against the comment above mine, which blithely suggested doing an experiment where the states would sort it out. Of course, the worst solution will be pretty far off the charts. Never underestimate how bad "worst" can get! There are states/regions that either don't have their act together or neglect their second-tier schools.


A DARPA for education. I like that idea. EARPA. By what metrics would we measure their success, since people can't seem to agree on that?


I stopped reading at "IQ rose 30 points in the 20th century around the world.."


That is well studied. A large meta-analysis claims 2.93 points per decade, over most of the 20th century. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4152423/
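Taken at face value, that rate lines up with the article's figure: 2.93 points per decade over ten decades is about 29.3 points, i.e. roughly the 30-point rise quoted for the 20th century.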


The Flynn effect says IQ rises 2.93 points per decade.

https://en.m.wikipedia.org/wiki/Flynn_effect


The Flynn effect documents an increase in IQ, but does not say there will be a consistent 2.93 point increase --- in fact, the increase is not consistent, but rather concentrates in areas of lower socioeconomic status.


Kids eat better and we focus way more on both their early development and schooling than before. Why would better results be so surprising?


Yes, it seems mostly to be from the nutrition aspect, as IQ is otherwise supposed to be immutable.


That's not true. It is definitely not settled among scientists that IQ is immutable, and, in fact, a widely held hypothesis for the increase (shared by Flynn himself) is that the cognitive environment has become increasingly complex, which has the effect of "training" higher IQ.



