Guessing the Teacher's Password (overcomingbias.com)
54 points by t0pj on July 11, 2008 | 38 comments



Someone should write a book like The Little Schemer but for physics. One of the best classes I took in college was somewhat like this. We measured the period of a pendulum using our heartbeats, and we timed how long it took things to roll down ramps of different slopes. From this we derived the formula for the pendulum's period and the relationship between slope and speed. Eventually we used this to derive the universal law of gravitation, with the period of the moon's orbit as the only given. It was pretty cool because there was no calculus involved: instead of deriving the equations analytically, we found them inductively by iterating over a couple thousand rows in Excel and then looking for the pattern. The idea of the class was to understand physics by deriving all the equations the same way the people who discovered them did.
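If you want a feel for the style, here's a rough sketch of the same kind of iteration in Python instead of Excel (the step size, starting angle, and lengths are made up for illustration): step a simulated pendulum forward a few thousand times for several lengths and notice that T / sqrt(L) comes out roughly constant.

    # Sketch: recover T ~ sqrt(L) by brute-force iteration, no calculus.
    import math

    def simulated_period(length, g=9.81, dt=0.0005, theta0=0.1):
        """Step theta'' = -(g/L)*sin(theta) forward and time one full swing."""
        theta, omega, t = theta0, 0.0, 0.0
        crossing_times = []
        prev_theta = theta
        while len(crossing_times) < 2:
            omega += -(g / length) * math.sin(theta) * dt
            theta += omega * dt
            t += dt
            if prev_theta < 0 <= theta:      # upward zero crossing
                crossing_times.append(t)
            prev_theta = theta
        return crossing_times[1] - crossing_times[0]  # one full period

    for L in (0.25, 0.5, 1.0, 2.0):
        T = simulated_period(L)
        print(f"L = {L:4.2f} m   T = {T:5.3f} s   T/sqrt(L) = {T / math.sqrt(L):5.3f}")

The last column hovers around 2*pi/sqrt(g) ≈ 2.0, which is exactly the kind of pattern you'd spot by eye in the spreadsheet.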


What school/teacher? I'd be interested in finding a curriculum or any references about that. I have daughters and I'm dying to find a good way to present science to them.


Prof. Peter Stein at Cornell. The class is Physics 202. I can email you the class notes and problem sets if you'd like. I'll also email him and ask if he's thought about publishing the course materials online.


Can you just put them online somewhere?


http://www.alexkrupp.com/Phys202.zip

If you glance at my answers to problem sets 4, 5, and 8 then you'll get the general idea of what the course involves. The problem sets are really fun because you're forced to play with the numbers and really understand the material. It might not be as hard as a traditional physics course, but it's every bit as rigorous in the real sense of the word. Its brilliance is that it really stresses the fundamentals, probably more so than any other physics material in existence. This is important because, as one professional Go player said after observing a Japanese baseball game,

"In every confrontation with a real American professional team it seems that what we need to learn from them, besides their technique of course, is how uniformly faithful their players are to the fundamentals. Faithfulness to the fundamentals seems to be a common thread linking professionalism in all areas." (T. Kageyama)


Sounds like a great course. How exactly did that course fit into the Cornell curriculum? Was it an elective for most people or could it substitute for algebra/calc based physics?


Cornell has two sets of requirements: the requirements for your college within the university, and the requirements for your major. This course fulfilled the distribution requirements for the college I was in, but it wouldn't necessarily count as one of the required science classes for your major. So, for example, if you were a physics student it wouldn't count toward your major, and therefore it would make more sense to take another class that counts toward both your major and your college distribution requirement. Sorry if this is unclear; it's kind of complicated.


Thanks, that made sense.


Awesome... thanks!


Thanks, I'd appreciate that. My email is in my profile.


Take a look at "The Road to Reality."

It starts assuming nothing more than basic arithmetic, and builds up to differential equations, tensors, string theory, quantum relativity, etc.

I'm warning you up front that it's very math-heavy. It takes some work to get through, but it's worth it.


[deleted]


If you look here (http://en.wikipedia.org/wiki/Inductive_reasoning), you will see that inductive reasoning is all about drawing broad conclusions from limited evidence.

In this context, the tens or thousands of experiments are generalized to support some "theory."

Every time I touch ice it is cold. By induction, all ice is cold. That's not "proof," but it's a great theory of ice.


If you read Hume or Popper you will see that inductive reasoning is logically impossible, and therefore can't actually be what people do.


I think we do this, or something very much like it: http://en.wikipedia.org/wiki/Bayesian_inference

I'll agree that we can set up a game we play with symbols and rules that doesn't permit inductive reasoning as valid. I can't agree that "what people do" has anything to do with logic.


"During the twentieth century, thinkers such as Karl Popper and David Miller have disputed the existence, necessity and validity of any inductive reasoning, including probabilistic (Bayesian) reasoning[citation n" http://en.wikipedia.org/wiki/Inductive_reasoning

On the other hand, E. Yudkowsky claims Popper's philosophy is a subset of Bayes' Theorem (or wait, don't take my word for that, I might have misinterpreted); here is the link and quote: http://yudkowsky.net/bayes/bayes.html

The Bayesian revolution in the sciences is fueled, not only by more and more cognitive scientists suddenly noticing that mental phenomena have Bayesian structure in them; not only by scientists in every field learning to judge their statistical methods by comparison with the Bayesian method; but also by the idea that science itself is a special case of Bayes' Theorem; experimental evidence is Bayesian evidence. The Bayesian revolutionaries hold that when you perform an experiment and get evidence that "confirms" or "disconfirms" your theory, this confirmation and disconfirmation is governed by the Bayesian rules. For example, you have to take into account, not only whether your theory predicts the phenomenon, but whether other possible explanations also predict the phenomenon. Previously, the most popular philosophy of science was probably Karl Popper's falsificationism - this is the old philosophy that the Bayesian revolution is currently dethroning. Karl Popper's idea that theories can be definitely falsified, but never definitely confirmed, is yet another special case of the Bayesian rules; if p(X|A) ~ 1 - if the theory makes a definite prediction - then observing ~X very strongly falsifies A. On the other hand, if p(X|A) ~ 1, and we observe X, this doesn't definitely confirm the theory; there might be some other condition B such that p(X|B) ~ 1, in which case observing X doesn't favor A over B. For observing X to definitely confirm A, we would have to know, not that p(X|A) ~ 1, but that p(X|~A) ~ 0, which is something that we can't know because we can't range over all possible alternative explanations. For example, when Einstein's theory of General Relativity toppled Newton's incredibly well-confirmed theory of gravity, it turned out that all of Newton's predictions were just a special case of Einstein's predictions.

You can even formalize Popper's philosophy mathematically. The likelihood ratio for X, p(X|A)/p(X|~A), determines how much observing X slides the probability for A; the likelihood ratio is what says how strong X is as evidence. Well, in your theory A, you can predict X with probability 1, if you like; but you can't control the denominator of the likelihood ratio, p(X|~A) - there will always be some alternative theories that also predict X, and while we go with the simplest theory that fits the current evidence, you may someday encounter some evidence that an alternative theory predicts but your theory does not. That's the hidden gotcha that toppled Newton's theory of gravity. So there's a limit on how much mileage you can get from successful predictions; there's a limit on how high the likelihood ratio goes for confirmatory evidence.

On the other hand, if you encounter some piece of evidence Y that is definitely not predicted by your theory, this is enormously strong evidence against your theory. If p(Y|A) is infinitesimal, then the likelihood ratio will also be infinitesimal. For example, if p(Y|A) is 0.0001%, and p(Y|~A) is 1%, then the likelihood ratio p(Y|A)/p(Y|~A) will be 1:10000. -40 decibels of evidence! Or flipping the likelihood ratio, if p(Y|A) is very small, then p(Y|~A)/p(Y|A) will be very large, meaning that observing Y greatly favors ~A over A. Falsification is much stronger than confirmation. This is a consequence of the earlier point that very strong evidence is not the product of a very high probability that A leads to X, but the product of a very low probability that not-A could have led to X. This is the precise Bayesian rule that underlies the heuristic value of Popper's falsificationism.

Similarly, Popper's dictum that an idea must be falsifiable can be interpreted as a manifestation of the Bayesian conservation-of-probability rule; if a result X is positive evidence for the theory, then the result ~X would have disconfirmed the theory to some extent. If you try to interpret both X and ~X as "confirming" the theory, the Bayesian rules say this is impossible! To increase the probability of a theory you must expose it to tests that can potentially decrease its probability; this is not just a rule for detecting would-be cheaters in the social process of science, but a consequence of Bayesian probability theory. On the other hand, Popper's idea that there is only falsification and no such thing as confirmation turns out to be incorrect. Bayes' Theorem shows that falsification is very strong evidence compared to confirmation, but falsification is still probabilistic in nature; it is not governed by fundamentally different rules from confirmation, as Popper argued.

So we find that many phenomena in the cognitive sciences, plus the statistical methods used by scientists, plus the scientific method itself, are all turning out to be special cases of Bayes' Theorem. Hence the Bayesian revolution.
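Just to make the arithmetic in that passage concrete, here's a small sketch of my own (not from Yudkowsky's page) checking the 1:10000 likelihood ratio and the -40 decibels, and showing how the same ratio drives a Bayesian update:

    # The numbers p(Y|A) = 0.0001% and p(Y|~A) = 1% come from the quote above.
    import math

    p_Y_given_A = 0.000001      # 0.0001%
    p_Y_given_notA = 0.01       # 1%

    likelihood_ratio = p_Y_given_A / p_Y_given_notA
    print(likelihood_ratio)                       # ~0.0001, i.e. 1:10000
    print(10 * math.log10(likelihood_ratio))      # ~-40 decibels of evidence

    # Posterior odds = prior odds * likelihood ratio (starting from even 1:1 odds).
    posterior_odds = 1.0 * likelihood_ratio
    print(posterior_odds / (1 + posterior_odds))  # ~0.0001: observing Y nearly rules out A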


In this passage Yudkowsky only discusses a very small portion of Popper's philosophy.

The most apparent problem with Yudkowsky's position is: where do probability estimates and hypotheses come from in the first place? That is an issue which Popper's philosophy provides answers for, and Bayesian Inference does not.


Bayesian Inference does not explain how new knowledge is created.


Is it really going to fall apart because you aren't born with all the priors you would theoretically need?


I don't understand your question. But the point of science is to create new knowledge, so it seems highly relevant.


I spend some of my time teaching at a private university. During a discussion with one student he said, "All I really know about Plato is that he thought the forms were transcendent whereas Aristotle thought the forms were immanent."

"Oh," said I, "that's interesting. What does it mean for a form to be transcendent or immanent?"

"I don't know, I just know that's what they thought."

We then had a discussion about whether or not he really knew that, and whether or not he was OK with paying over $25k a year to learn how to be a parrot.

The sad thing: he was OK with it, and probably rightly so. He realized that mostly what people want are parrots and you can get paid quite a decent amount of money for being a good parrot. Not a bad gig if you're fine being a cog. There are certain comforts it provides.


This sadly is how the world works. In school, even in high school and college, the students who guess the teacher's password are rewarded. The (few) students who make an effort to discover and think through something on their own are criticized for wandering from the beaten path, even on the occasions when they're actually right.

High school science competitions (school science fairs, all the way up to Siemens-Westinghouse and the Intel STS) are particularly guilty of this.


"The (few) students who make an effort to discover and think through something on their own are criticized for wandering from the beaten path, even on the occasions when they're actually right."

I remember I just -couldn't- accept that we can't take square roots of negative numbers. I mean, sure, two negatives multiplied together is a positive, but...there HAS to be a way! Numbers, to my 6th grade mind, weren't arbitrarily going to deny your ability to manipulate them.

Fortunately for me, I happened to be reading a book on Mandelbrot, and in the course of discussing his work the author introduced the concept of imaginary numbers. So the next day in class I asked: "Are you suuuure?"

"Yes, I'm sure."

"Well, I was reading in this book and so-and-so says you can take the square root of negative numbers, you get 'i' for the root of -1 and..."

"Why don't you go wait outside."

Where I got told that although I move to the beat of a different drummer, when I'm in her class, I really just need to be quiet. That's when public education and I had a falling out from which our relationship has never recovered.

All of that to say: I think part of the 'problem' is that kids who do try to think on their own are often in a situation where they're challenging accepted rules without the social grace to do so constructively. But rather than being taught the social skills, they're just shut down. Very sad, actually. They could be taught how to 'wander well'.


Isn't sqrt(-1) = i an axiom though, and not a theorem? In other words, if you just started with the integers and the basic rules of arithmetic, you would never run into the need for imaginary numbers. It's only after you posit their existence that a lot of cool results flow from it.

Am I wrong there?


I don't really know how to interpret your sentence. Complex numbers are a field in which you do the 'basic rules of arithmetic'. You can't do all of the 'basic rules of arithmetic' on the integers; they aren't even closed under division.

i is defined as sqrt(-1), if that answers your question.


I should have said "Real Numbers" instead of "Integers", sorry.

My point is that sqrt(-1) = i is an extra assumption that you don't need to make in order to derive the math you do in school. If you don't make that assumption, you can still derive a lot of math. If you do make that assumption, then you can also derive the theorems of complex analysis.

I think that's true, anyway. I could be wrong.

It just seems strange to me that someone would be bothered by the lack of negative square roots, since their existence is never derived, only assumed. But then again, people's minds work very differently, especially in Mathematics.

I was the opposite in Math class. I fought accepting "i" when they tried to teach it to us since it seemed so arbitrary and contrived to me. This was before I discovered that all Math is arbitrary and that there is no "real math", anyway.
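For what it's worth, one standard construction makes this concrete (a sketch of my own, not from anyone in the thread): build complex numbers as ordered pairs of reals with a particular multiplication rule. Then i is just the pair (0, 1), and i*i = -1 falls out of the rule rather than being a separate assumption about square roots.

    # Complex numbers as ordered pairs of reals:
    # (a, b) * (c, d) = (a*c - b*d, a*d + b*c)
    def cmul(x, y):
        a, b = x
        c, d = y
        return (a * c - b * d, a * d + b * c)

    i = (0.0, 1.0)
    print(cmul(i, i))   # (-1.0, 0.0), i.e. the real number -1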


As the "Road to Reality" explains, if you study Quantum mechanics, complex numbers become every bit as "real" as the integers or real numbers. You just can't explain some of the quantum mechanical phenomena or concepts without using complex numbers at all.


It's as you say: it is contrived, but no more contrived than the basic integers. These are abstract systems that we have contrived in ways that are most useful to us. The natural numbers just seem more "real" because everyone in most cultures deals with "quantity" and "magnitude."


When I was learning complex numbers, the teacher didn't want us to assume sqrt(-1) = i; we had to go like this:

sqrt(-1) = sqrt(e^(i*Pi)) = e^(i*Pi/2) = i

So the new concept was really a 2D plane for numbers, not a new definition for sqrt of negative integers.


Actually, even the definition of this form requires i, because it's

e^(i*pi/2) = i.

The simple assumption that sqrt(-1) = i is problematic because you can get something like

i^2 = (sqrt(-1))^2 = sqrt((-1)^2) = 1.

The wonder and surprise of complex numbers is that assuming seemingly arbitrary properties of a constant like i leads to a number of deep and beautiful results.
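A quick numerical illustration of that pitfall (my own sketch, using Python's cmath module): the identity sqrt(a*b) = sqrt(a)*sqrt(b) is exactly what breaks for negative arguments, which is where the bogus i^2 = 1 step sneaks in.

    import cmath

    a = b = -1
    print(cmath.sqrt(a * b))                  # (1+0j)  -- sqrt(1)
    print(cmath.sqrt(a) * cmath.sqrt(b))      # (-1+0j) -- i * i; the identity fails

    # The principal branch still behaves as expected elsewhere:
    print(cmath.sqrt(-1))                     # 1j
    print(cmath.exp(1j * cmath.pi / 2))       # ~1j, the e^(i*pi/2) = i from above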


Overcoming Bias is one of those websites where you should read every article, like Paul Graham's essays.


Yeah, although you'd have a lot more work to do. Not only are there multiple posts a day, but some of Eliezer's have dependency trees reaching months back!


I remember the first time I got an F on an assignment even though I'd gotten the correct answer to every problem.

I hadn't reached my correct answers the "right" way.


I loved calculus because about four weeks of teaching provided the foundation for everything else. My tests usually involved having to derive something that had apparently already been derived in the book and then 'applied' on practice questions that drilled the 'right' way of doing things by rote. I lost points for that.

I would also lose points for "not showing my work", but if you can't factor, say, x^2-9 in your head, maybe you have deeper issues.


If the teacher was teaching a process, and looking for evidence that you understood that process, it seems entirely reasonable to not count your work.


I had simply taken shortcuts I'd seen later in the book for factoring quadratics. They wanted to see it written exactly as it appeared in the current examples.

I realized that math in school was more like one of those Simon games, where you just mindlessly press the arbitrary sequence of lights you just saw, than actual mathematics. This analogy helped me suffer through "school math" while I pursued real math on my own.


Ugh, I hate that. I got marked down a letter grade on a math test once for not showing enough intermediate steps, even though my answers were correct.

I also once got an answer marked wrong because the teacher couldn't understand my intermediate steps, even though the answer (and steps) were actually correct. When I showed him, he refused to change my grade. That was at University.


I've been thinking along similar lines since I've been interviewing; far too many interviewers are only looking for the password.


Only tangentially related, but years ago, in elementary school, I decided to guess the password after the school restricted access to SimCity. I got it right on the first try (it was just the school name), but I told everyone 'I hacked it', for major props.



