
Sort-of.

Per my reading of it, they used only one measure, the NAEP. The GP mentions a paper that used the ACT as a guide. At the least, we should resolve the difference between the NAEP and the ACT, if there is one. But, like the SAT or college acceptance rates, those are just a few measures. Achievement is not easy to define in this context, let alone measure. All the test scores are just proxies for 'achievement' as a general term. Should we also measure household income ten years out, college graduation rates, number of pregnancies, number of marathon runners? It's all just a proxy, in the end, for trying to determine, in granular detail, whether education is worth spending tax money on.




Aren't these all the same questions we should be asking when we decide to have public education? When we decide how much we will spend on it? How many years our kids will go? What will be taught?

What are our goals?


I mean, that's a question humans have been asking for nearly our entire species' history. Generally, it's a lot more fun to be around 'smarter' people than 'dumber' ones, so education is somewhat prioritized.


I'm getting at the idea that maybe we haven't been asking ourselves what our goals are here, and we're at a point where we're doing it this way because that's how we do it. And the fact that we're still wondering what a decent way to measure the results of our efforts would be is kind of telling.


And the thing about tests is that they can be gamed, especially ones like the ACT where it's all multiple choice. If a student knows just a little bit of math, some key vocabulary, and how to work their way around a calculator, they could easily get a score that's 25% above average (mid 20s; the average is 20.7). The same can be done with the other subjects as well. The biggest issues with students taking it are (1) unmotivated people who'll never go to college being forced to take it and (2) not knowing how to game tests.

The second one can be taught, though.


>If a student knows just a little bit of math, some key vocabulary and how to work their way around a calculator,

That's the stuff we want them to know. We want people who know more math and vocabulary to score higher. That's not 'gaming.' That's just the basic concept of testing.


But they don't have to know the math they're being tested on; that's the thing. And for vocabulary, they just need to know a few key words, like 'root'. They don't need to know how to find the roots of a quadratic, because that can be gamed with a calculator.

That's what I'm talking about; they don't need to actually know how to do the stuff that's on the ACT, because it's multiple choice and there are tons of tricks you can use instead. That's why it's "gaming" -- you don't need to know the subject.


It sounds to me like they found the roots of a quadratic. Memorizing the quadratic formula in high school wasn't any more useful than what you described. No one really gets to know math until they've taken Real Analysis.


Except the point is they don't have to know how to solve the quadratic equation. They're handed candidate roots in the answer choices, so it becomes just a matter of plugging them in. That's hardly a test of mathematical ability; it mostly asks whether you recognize that plugging in answers is a valid test-taking strategy.
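
For illustration, here's a minimal sketch of that plug-in-the-choices strategy on a made-up question (the quadratic and the answer choices are hypothetical, not taken from an actual ACT):

    # Hypothetical question: "Which value of x satisfies x^2 - 5x + 6 = 0?"
    # The answer choices below are made up for illustration.
    choices = {"A": 1, "B": 2, "C": 4, "D": 5, "E": 7}

    def f(x):
        # Left-hand side of the quadratic from the question stem.
        return x**2 - 5*x + 6

    # "Gaming" it: plug each choice into the equation and keep whichever
    # one evaluates to zero. No factoring, no quadratic formula.
    for label, x in choices.items():
        if f(x) == 0:
            print(f"Answer: {label} (x = {x})")  # prints: Answer: B (x = 2)

On a calculator, that same loop is a few keystrokes per choice, which is the point being made above: substitution into the answer choices replaces actually solving the equation.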



