
Half of a Coin: Negative Probabilities [pdf] - jmount
http://www.wilmott.com/pdfs/100609_gjs.pdf
======
Xcelerate
As the article hints toward the end, if you extend probabilities into the
domain of complex numbers, you basically get quantum mechanics.

Scott Aaronson has a well-written and thorough explanation on the matter:
[http://www.scottaaronson.com/democritus/lec9.html](http://www.scottaaronson.com/democritus/lec9.html)

~~~
cperciva
_if you extend probabilities into the domain of complex numbers, you basically
get quantum mechanics_

To me, the most fundamental question of quantum physics is this: Why not
quaternions? As a mathematician, the distinguishing characteristic of C is
that it is the algebraic closure of R; but I see no physical relevance to
that.

~~~
tel
Is there not a model of QM that uses quaternionic probabilities? Is there a
place where it breaks down?

~~~
jaekwon
I remember reading things online where people claim that the natural world
(probably including what's known in quantum mechanics) is better explained
with quaternions.

~~~
judk
It's not broadly true. Quaternions are mainly useful for math on the surface
of a sphere.

------
grondilu
« Every other number is negative »

I don't understand this part. Other than what? Is there a translation error or
something?

PS. Ok, got it.
[http://dictionary.reference.com/browse/every%20other](http://dictionary.reference.com/browse/every%20other)

"Every other" is an English idiom indicating alternation. I did not know it.

~~~
jaekwon
No error, it means every other coefficient in the p.g.f. is negative, which is
like saying that certain outcomes have negative probability. In the p.g.f. for
the double-coin example, the coefficient of the monomial z^0 is 1/4, meaning
that the probability of two coins both landing tails is 1/4. If the
coefficient there were negative, it wouldn't make sense intuitively, but
that's the "magic" when it comes to these magical half coins.
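
As a concrete illustration (a sketch, not code from the paper): the half
coin's p.g.f. coefficients come from the binomial series for (1+z)^(1/2), and
computing them exactly shows the alternating signs being discussed.

```python
from fractions import Fraction

def half_coin_coeffs(n_terms):
    """Coefficients of the binomial series (1 + z)**(1/2), using the
    recurrence C(1/2, k+1) = C(1/2, k) * (1/2 - k) / (k + 1)."""
    c = Fraction(1)
    out = []
    for k in range(n_terms):
        out.append(c)
        c = c * (Fraction(1, 2) - k) / (k + 1)
    return out

coeffs = half_coin_coeffs(6)
# [1, 1/2, -1/8, 1/16, -5/128, 7/256]: from z^2 onward, every other one is negative
```

Multiplying this series by itself gives back 1 + z, consistent with two half
coins combining into an ordinary coin's p.g.f. (up to normalization).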

~~~
grondilu
I'm sorry, I was not clear. It's not the negative part I don't get, it's the
"every other number". Other than what? I'm not a native English speaker, so it
might be a math-related English expression I don't know. Is it?

~~~
mhlakhani
"every other number" == "every second number", i.e. the first, third, fifth,
... coefficients are positive, while the second, fourth, sixth, ...
coefficients are negative. Hope this helps.

------
mturmon
Thanks for this link.

Many investigators have been dissatisfied with the strong assumptions that
conventional measure-theoretic probability requires you to make. The community
I'm most familiar with that is stretching these bounds is the imprecise-
probability community,
[http://www.sipta.org/index.php?id=cs](http://www.sipta.org/index.php?id=cs)

They have a conference every other year. The simplest generalization is to
interval-valued probabilities, but there are several other interesting
theories.
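
A minimal sketch of the interval-valued idea (hypothetical code, not anything
from SIPTA): each event gets a lower and an upper probability instead of a
single number, and complementation swaps and reflects the bounds.

```python
from dataclasses import dataclass

@dataclass
class IntervalProb:
    """An event whose probability is known only to lie in [lo, hi]."""
    lo: float
    hi: float

    def complement(self):
        # P(not A) is bounded by 1 minus P(A)'s bounds, swapped
        return IntervalProb(1 - self.hi, 1 - self.lo)

rain = IntervalProb(0.3, 0.5)
no_rain = rain.complement()  # bounds become [0.5, 0.7]
```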

There are some phenomena, such as a failure of long term averages of some
actual physical-world stochastic processes to converge, that seem to indicate
that conventional mathematical probability does not predict their behavior
correctly. I.e., it is a theorem that long term averages of stationary
processes should converge, and these don't.

~~~
dnautics
_There are some phenomena, such as a failure of long term averages of some
actual physical-world stochastic processes to converge_

That's a general property of Lévy-stable probability distributions, which
occur all over the place, and you don't have to resort to unconventional
probability to explain most of these processes.

~~~
mturmon
No, this is different. There is a theorem, which you can find in Loève (and
other places too, I'm sure), that any stationary stochastic process with
finite first moment will have long-term averages that converge. IIRC, you
don't even need finite second moments. In any case, it has great generality;
basically all you need is stationarity.

Presumably a physical system that has stable components in a stable
environment will be stationary. You can build a simple electrical circuit of a
certain type and compute its long term averages and they do not, in this case,
converge. Not slow convergence, like you would see in a process with long
range dependence or heavy tails, but lack of convergence, despite averaging
over very long times.

I have not done this experiment myself, but my PhD advisor, who did some of
the early work on unconventional axiomatizations of probability, did.

~~~
dnautics
Lévy-stable distributions with an alpha less than or equal to 1 (e.g. the
Cauchy distribution) don't have finite first moments.

The Cauchy distribution is most simply explained with the blind-archer
analogy. An archer is blindfolded and placed at a random orientation (let's
restrict it to [0, pi] radians for simplicity). She then shoots an arrow in
the direction she's pointing, and it flies until it hits an infinitely long,
straight wall. What is the position along the wall where the arrow strikes?

The CDF is a normalized arctan(x), which should be easy to visualize, but the
integral of x times the density, x / (pi (1 + x^2)), i.e. the first moment, is
undefined.
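
A quick simulation (a sketch, not from the paper) makes the contrast vivid:
running averages of Gaussian draws settle down, while running averages of
Cauchy draws keep getting derailed by huge samples no matter how long you
average.

```python
import math
import random

random.seed(0)

def running_means(draw, n):
    """Prefix averages of n samples from draw()."""
    total, means = 0.0, []
    for i in range(1, n + 1):
        total += draw()
        means.append(total / i)
    return means

n = 100_000
gauss = running_means(lambda: random.gauss(0.0, 1.0), n)
# Standard Cauchy variate via the inverse CDF: tan(pi * (U - 1/2))
cauchy = running_means(lambda: math.tan(math.pi * (random.random() - 0.5)), n)

print(abs(gauss[-1]))               # small: the average converges toward 0
print(max(abs(m) for m in cauchy))  # large: single draws keep kicking the average around
```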

~~~
mturmon
I'm aware that there are distributions without finite first moments; that's
why I specified it. But a real electrical signal cannot have infinite expected
value, so that's not an explanation.

Note that your archer analogy implicitly requires infinite expected energy
input.

------
wirrbel
The author mentions two interpretations of probability, the frequentist and
the Bayesian one. Under both interpretations, a negative probability does not
make sense. The frequentist thinks of probability as the ratio of occurrences
to trials in the limit of an infinite number of experiments; negative
probabilities would imply negative numbers of occurrences, which does not
really make sense.

In the Bayesian interpretation, probabilities are somewhat of a generalization
of Boolean logic. Based on the assumption that there is a True (1) and a False
(0), probabilities represent the degree of reasonable belief that a
proposition is true. I.e., probabilities are bounded in [0, 1].

While in math it often turns out that one should not dismiss a certain
extension of a calculus (complex numbers are the textbook example: they are
not immediately recognizable as "natural", but they work flawlessly and give
interesting insights), I really do not get the point here. Why would we need
negative probabilities? What could we gain from using them?

The paper brings up an example from the theory of interest; it seems the
author identifies an expectation-like equation with negative probabilities. I
wonder if this could be interpreted as a discrete analogue of quasi-
probability densities, which can indeed be negative.

~~~
wirrbel
Really, the link posted by another commenter here,
[http://www.scottaaronson.com/democritus/lec9.html](http://www.scottaaronson.com/democritus/lec9.html),
is worth a read.

------
quarterwave
The probabilities discussed in this paper seem to have an L1 flavour
(Manhattan distance), while those in quantum mechanics come from L2 functions
(Pythagorean distance).
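
To make the distinction concrete (a sketch of standard textbook facts, not
anything from the paper): classical probabilities are nonnegative reals
normalized in the 1-norm, while quantum amplitudes are complex numbers
normalized in the 2-norm.

```python
import math

# Classical distribution: nonnegative entries summing to 1 (L1 normalization)
classical = [0.25, 0.75]
l1 = sum(classical)

# Quantum state: complex amplitudes whose squared magnitudes sum to 1 (L2)
amplitudes = [complex(1, 1) / 2, complex(0, 1) / math.sqrt(2)]
l2 = sum(abs(a) ** 2 for a in amplitudes)

print(l1, l2)  # both equal 1, up to rounding
```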

------
jaekwon
On page 67, "According to the binomial theorem," ...

What follows on the right-hand side, I'm not sure where that came from.

According to Wikipedia: (1+x)^n = \sum_{k=0}^n {n \choose k} x^k. If you plug
in 1/2 for n, you get a sum from k=0 to k=1/2, not k=0 to k=infinity as in the
paper. So, this is confusing :)

~~~
arjie
He meant the binomial series (i.e. the Taylor series about the origin of
(1+x)^n).
[https://en.wikipedia.org/wiki/Binomial_series](https://en.wikipedia.org/wiki/Binomial_series)
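
A quick numerical check (a sketch, assuming nothing beyond the formula above):
for non-integer n the sum never terminates, but for |x| < 1 its partial sums
converge to (1+x)^n.

```python
import math

def binomial_series(n, x, terms):
    """Partial sum of (1 + x)**n = sum_k C(n, k) x**k, the binomial series.
    For non-integer n the coefficients never hit zero, so the series is
    infinite, but it converges for |x| < 1."""
    total, coeff = 0.0, 1.0
    for k in range(terms):
        total += coeff * x ** k
        coeff *= (n - k) / (k + 1)   # C(n, k+1) = C(n, k) * (n - k) / (k + 1)
    return total

approx = binomial_series(0.5, 0.21, 40)
exact = math.sqrt(1.21)  # (1 + 0.21)**(1/2) = 1.1
print(approx, exact)
```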

------
loup-vaillant
I'm not sure you should be allowed to call "probability" a number that can
fall outside the [0,1] range (or even the ]0,1[ range).

Without even reading the paper, I'm pretty sure it doesn't talk about
probabilities, but about some related notion.

~~~
stared
If you read it, yes, it is a generalization.

------
jjgreen
Interesting example of a PDF that is borked in Firefox's built-in reader: the
summation signs ("sigmas") are vertically stretched. This is not seen in
Evince, for example.

