Golden powers are nearly integers (johndcook.com)
277 points by adamnemecek on Mar 22, 2017 | 36 comments



Specifically, they're nearly Lucas numbers. For those unfamiliar, the Lucas numbers are defined by the recursion L_0 = 2, L_1 = 1, and L_n = L_{n-1} + L_n. (I.e., the Fibonacci recursion, but with different starting values.) They can also be given explicitly by L_n = phi^n + (-1/phi)^n. Since (-1/phi)^n goes rapidly to zero, phi^n will be very close to L_n. More specifically, for n>=2, (1/phi)^n < 1/2, so L_n will be the nearest integer to phi^n.

The Fibonacci numbers, meanwhile, can be given by the formula F_n = (phi^n - (-1/phi)^n) / sqrt(5). So a similar thing holds for them.

More generally, phi is a Pisot number: https://en.wikipedia.org/wiki/Pisot%E2%80%93Vijayaraghavan_n... Any Pisot number will have its powers approach integers at an exponential rate. (That is, the distance to the nearest integer decreases exponentially.)
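
For concreteness, here's a throwaway Python check of that nearest-integer claim (my own sketch, not from the article):

    from math import sqrt

    phi = (1 + sqrt(5)) / 2

    # Lucas numbers: L_0 = 2, L_1 = 1, L_n = L_{n-1} + L_{n-2}
    lucas = [2, 1]
    for _ in range(2, 20):
        lucas.append(lucas[-1] + lucas[-2])

    for n in range(2, 20):
        print(n, phi**n, lucas[n], abs(phi**n - lucas[n]))

The gap in the last column is exactly (1/phi)^n, so it shrinks by a factor of about 0.618 with every step.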


Another way to get at this:

  φ^1 = φ
  φ^2 = φ + 1 [this is the definition of φ]
  φ^3 = φ(φ + 1) = 2φ + 1
  φ^4 = φ(2φ + 1) = 3φ + 2
  φ^5 = φ(3φ + 2) = 5φ + 3
  φ^6 = φ(5φ + 3) = 8φ + 5
  ...
  φ^n = F(n)φ + F(n-1)
Note that F(n+1)/F(n) → φ as n → ∞. Or flipped around, F(n)φ → F(n+1).

So φ^n = F(n)φ + F(n-1) → F(n+1) + F(n-1) = L(n) as n → ∞. This gets close to being an integer because F(n+1) and F(n-1) are both integers.
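
A quick numeric sanity check of that identity, as a throwaway Python sketch (using the usual convention F(0) = 0, F(1) = 1):

    from math import sqrt

    phi = (1 + sqrt(5)) / 2

    # Fibonacci numbers: F(0) = 0, F(1) = 1
    F = [0, 1]
    for _ in range(2, 20):
        F.append(F[-1] + F[-2])

    # phi^n versus F(n)*phi + F(n-1): the two columns should match
    for n in range(1, 20):
        print(n, phi**n, F[n] * phi + F[n - 1])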


I'm almost with you, but you need a bit more.

To show what we want, we must prove the stronger condition that F(n)φ - F(n+1) → 0. Merely knowing that F(n+1)/F(n) → φ only shows that the difference grows more slowly than F(n).


>To show what we want, we must prove the stronger condition that F(n)φ - F(n+1) → 0.

Noting that {F(n+1)/F(n)} is the sequence of convergents of the continued fraction expansion of φ, your stronger condition follows from Theorem 5 at https://en.wikipedia.org/wiki/Continued_fraction .
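
For what it's worth, the difference F(n)φ - F(n+1) works out to be exactly -(-1/φ)^n, so it not only goes to zero but does so geometrically. A quick sketch to see it:

    from math import sqrt

    phi = (1 + sqrt(5)) / 2

    F = [0, 1]
    for _ in range(2, 26):
        F.append(F[-1] + F[-2])

    # |F(n)*phi - F(n+1)| shrinks by a factor of about 1/phi (~0.618) per step
    for n in range(1, 25):
        print(n, abs(F[n] * phi - F[n + 1]))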


I've always enjoyed that formula for the Fibonacci numbers, primarily because it's so unintuitive that the right-hand side would even be an integer. It's a fun exercise to prove that (without resorting to the fact that the Fibonacci numbers are obviously integers). Hint: start with -1/phi = 1-phi and use phi=(sqrt(5)+1)/2.


That is a special case of an even more amazing result.

Given a polynomial P(x), any expression that can be written as a symmetric polynomial in the roots of P can be written as a polynomial in the coefficients of P. See https://en.wikipedia.org/wiki/Elementary_symmetric_polynomia... for more.

This applies to arbitrary rings. So if P(x) and Q(x) are both polynomials, then any polynomial which is symmetric in the roots of P(x) and also symmetric in the roots of Q(x) can be written as a polynomial in the coefficients of Q(x) and P(x). Therefore, for example, (x-sqrt(2)-sqrt(3))(x-sqrt(2)+sqrt(3))(x+sqrt(2)-sqrt(3))(x+sqrt(2)+sqrt(3)) will work out to be a polynomial in the coefficients of x^2-2 and x^2-3. It will therefore be an integer polynomial with those 4 roots.

This is actually the original way in which people proved that the algebraic integers formed a ring.
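
As an illustration of that last example, here's a quick sketch (assuming sympy is available; the code is mine, not the parent's):

    import sympy as sp

    x = sp.symbols('x')
    r2, r3 = sp.sqrt(2), sp.sqrt(3)

    # Multiply out the product over all four sign choices of sqrt(2) and sqrt(3);
    # every irrational term cancels, leaving an integer polynomial.
    poly = sp.expand((x - r2 - r3) * (x - r2 + r3) * (x + r2 - r3) * (x + r2 + r3))
    print(poly)  # x**4 - 10*x**2 + 1

So sqrt(2) + sqrt(3) is a root of the integer polynomial x^4 - 10x^2 + 1, exactly as the symmetric-function argument predicts.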


> start with -1/phi = 1-phi and use phi=(sqrt(5)+1)/2.

This is the same fact twice. (Well, the second part is just one of the two roots of the polynomial equation x^2 = x + 1.)


Doesn't mean you can't use both of them!


Proof by hindsight!


The fact it comes from is the same ... those aren't the same operations being done.


> L_n = L_{n-1} + L_n

L_{n-1} = 0 ;-)

You probably meant L_n = L_{n-2} + L_{n-1}


Thanks! This is what happens when you decide to switch indices, I guess. Unfortunately seems I can no longer edit the comment.


Dude, seriously, can you explain this in simple terms...?


If people are interested in this sort of thing, there's some nice linear algebra underlying the idea.

The Fibonacci recurrence is a linear recurrence, meaning that if you have two different sequences such that

    F[n] = F[n - 1] + F[n - 2]
    G[n] = G[n - 1] + G[n - 2]
then the scalar multiples (multiply every element of F by some value k) and the termwise sum of F and G will also solve this recurrence. One nice thing to do then is to search for a base p such that one solution to the recurrence is P[n] = p^n. This gives an equation for p, namely p² = p + 1, which also characterizes the golden ratio φ = (1 + √5)/2 and its negative reciprocal -1/φ = (1 − √5)/2.

Well, remember that the recurrence gives you everything if you also specify the first two numbers F[0] and F[1]. We know that these two sequences that we just computed are

    P1 = [1, (1 + √5)/2, ...]
    P2 = [1, (1 − √5)/2, ...]
and given F0, F1 you can use some simple linear algebra to solve for a and b such that:

    a P1[0] + b P2[0] = F0,
    a P1[1] + b P2[1] = F1.
This gives a closed-form solution for the Nth term in terms of these Ps. So the sequences obeying the recurrence actually form a 2-dimensional vector space, and these two power sequences just happen to be a nice basis for that vector space.
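
For concreteness, here's a small sketch of that 2x2 solve, using the standard Fibonacci start F0 = 0, F1 = 1 as the example values (my choice, not the parent's):

    from math import sqrt

    p1, p2 = (1 + sqrt(5)) / 2, (1 - sqrt(5)) / 2

    # Solve a*P1[0] + b*P2[0] = F0 and a*P1[1] + b*P2[1] = F1 by hand
    F0, F1 = 0, 1
    b = (F1 - F0 * p1) / (p2 - p1)
    a = F0 - b

    # Closed form: F[n] = a*p1^n + b*p2^n (this is Binet's formula)
    for n in range(10):
        print(n, round(a * p1**n + b * p2**n))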

Similarly, for the powers of phi, we can use this argument in reverse and say "I know I want P1 plus some multiple of P2, chosen so that all of the terms are integers"; the P2 part will then approach zero (since its base has modulus less than 1).

For this, we can look at the recurrence and say "oh, we just need the first two terms to be integers and then the recurrence makes everything else integers," so choosing a = 1, b = 1 gives us F[0] = 2, F[1] = 1, and these happen to be called the "Lucas numbers".

Similarly, we could expect that the powers of any solution p > 1 of p² = m p + n (with m and n integers) will be close to integers as long as the other solution q lies in the interval -1 < q < 1. Completing the square gives (p − m/2)² = n + m²/4, so the two solutions are evenly spaced about m/2, each at distance √(m²/4 + n) from the center.

So for example the recurrence T[n] = 3 T[n-1] - T[n-2] is solved by the numbers (3 ± √5)/2, so (3 + √5)/2 must have this property, and T[n] = 3 T[n-1] + T[n-2] gives you (3 ± √13)/2, which has this property too.
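
A throwaway check of those last two examples, just measuring how far each power sits from the nearest integer:

    from math import sqrt

    # Larger roots of p^2 = 3p - 1 and p^2 = 3p + 1; in both cases the other
    # root has modulus less than 1, so high powers should hug the integers.
    for p in [(3 + sqrt(5)) / 2, (3 + sqrt(13)) / 2]:
        print([round(p**n - round(p**n), 4) for n in range(1, 12)])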


Suppose f(x) = x^n + a_{n-1} x^{n-1} + ... + a_1 x + a_0 is a monic polynomial with integer coefficients. Let x_1, ..., x_n be all its (complex) roots. If it happens that the absolute value of x_1 is larger than 1, but the absolute values of all x_2, x_3, ..., x_n are smaller than 1, the powers of x_1 will be close to integers. The golden ratio (1+sqrt(5))/2 is a root of x^2 - x - 1, and the other root (1-sqrt(5))/2 is smaller than 1 in absolute value, so the observation follows.

One way to prove the assertion is the following: note that for large n, x_1^n + x_2^n + ... + x_n^n is very close in value to x_1^n, because all the other summands are very small in absolute value, since they start out smaller than 1, and they decrease exponentially to 0. But it just so happens that x_1^n + x_2^n + ... + x_n^n is an integer -- indeed, this is a symmetric polynomial with integer coefficients in x_1, ..., x_n, and every such polynomial has a unique expression as a polynomial in the elementary symmetric polynomials with integer coefficients -- this is the fundamental theorem of symmetric polynomials[1]. But, if you plug the roots of a given polynomial into the elementary symmetric polynomials, you'll just get the coefficients of the original polynomial times (-1)^k, that is, the original a_0, a_1, ..., a_{n-1} -- for example, a_0 is clearly the product (-1)^n x_1 * x_2 * ... * x_n, a_{n-1} is the sum (-1)^1 (x_1 + ... + x_n), a_{n-2} is a sum of two-products (-1)^2 (x_1 x_2 + x_1 x_3 + ... + x_{n-1} x_n) and so on.

For a concrete example, note that x_1^2 + ... + x_n^2 = (x_1 + ... + x_n)^2 - 2(x_1 x_2 + ... + x_{n-1} x_n) = a_{n-1}^2 - 2 a_{n-2}, which is an integer. Try expressing x_1^3 + ... + x_n^3 in terms of a_{n-1}, a_{n-2} and a_{n-3}.

[1] - https://en.wikipedia.org/wiki/Elementary_symmetric_polynomia...
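
A numerical illustration of the argument (my own sketch, using numpy and the polynomial x^3 - x - 1, whose largest root, the "plastic number", is a Pisot number; the other two roots are complex with modulus below 1):

    import numpy as np

    # Roots of x^3 - x - 1 (coefficients listed from highest degree down)
    roots = np.roots([1, 0, -1, -1])
    x1 = float(max(roots, key=abs).real)  # ~1.3247, the plastic number

    for n in range(2, 25):
        # The power sum x_1^n + x_2^n + x_3^n is an integer (up to rounding error)
        power_sum = float(sum(r**n for r in roots).real)
        print(n, round(power_sum), x1**n, abs(x1**n - round(power_sum)))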


I don't have any advanced math experience, but this is one of the coolest things I've read in a long time - kudos to the author for explaining a couple interesting things about a complex topic in a simple way.


But he didn't explain why it happens at all?


You're right, he didn't explain why it happens. But like I said, he explained "a couple interesting things", showing readers that these interesting patterns exist. To people outside of mathematics like me, it's cool just that these patterns and phenomena are present, even if we don't grasp the math behind why.

That being said, an easy-to-understand explanation of why it happens (to the extent that is possible), like in some of the other comments, would be super interesting as well.


Some more discussion of the same article on Reddit:

https://www.reddit.com/r/math/comments/60uua3/powers_of_the_...


At least he clearly explained the problem (or really phenomenon), so anyone can understand it.


I mean, I'm not knocking the blog. It's fine. But I'm having a hard time identifying what the complex topic was that was given a simple explanation. "Here's a graph" isn't a complex topic, and saying "here's a graph" doesn't really explain it either.


Another interesting relationship between powers and infinity: on an infinite N-dimensional lattice, taking an infinitely long random walk, starting with dimension 3 you have about a 34% chance of returning to the origin, with each successive dimension reducing that chance. The graph of this probability follows a log curve. (Dimensions 1 and 2 have a 100% probability of returning to the origin.)


So what this is saying is, if you don't have "1" in your numerical system, you can recover it by taking φ to the power of infinity and then dividing by infinity?


Any "number system" (be it a field, ring, or multiplicative group) has a "1"


Depends on your definition of "ring"; for some people, a ring need not have a multiplicative identity. But yes, any number system worth its salt has at least a couple of integers in it.


Wouldn't a ring without a multiplicative identity be a rng rather than a ring?


Only if it started as a riiiing...


I was joking. :X


φ is expressed in terms of 1, so if you have "φ" you have "1."


Yeah, it's really obvious too if you know basic algebra. The article doesn't even attempt to explain it, so what's the point, except for blogspam?


I thought I knew basic algebra... I guess not. It definitely is not obvious to me.


To be fair, "algebra" means very different things to different people, so I don't think Grue3 meant to be quite as brash as it may have sounded.


What's the definition of basic algebra that makes that fact obvious? I took 3 semesters of algebra in college (Linear Algebra, Abstract Algebra I and II), and don't see anything obvious about it. But then, neither does Terry Tao:

> "the powers φ, φ2, φ3, … of the golden ratio lie unexpectedly close to integers...


It becomes obvious when you know closed form expression for Fibonacci numbers as a sum of powers of phi (which I'm sure Tao is aware of). I was taught that in high school. The only reason this wouldn't be obvious to Tao is if he was working on much harder problems for so long that he forgot the basic stuff.


It's not basic algebra in the sense of my high-school "Maths B" class. But it is just the kind of thing we did in "Maths C".

Here's a hint: use the definition of φ, in the form

  1 = φ - 1/φ
And use it to prove that if

  F[k-1] = φ^(k-1) + 1/(-φ)^(k-1)  and  F[k] = φ^k + 1/(-φ)^k
are both integers, then

  F[k+1] = φ^(k+1) + 1/(-φ)^(k+1)
Must also be an integer (in fact F[k+1] = F[k] + F[k-1]). Now check the base cases F[0] = 2 and F[1] = 1, conclude by induction that all F[k] are integers, and note what happens to 1/(-φ)^k as k grows to infinity.
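
If it helps to see the pieces numerically first, here's a quick sketch:

    from math import sqrt

    phi = (1 + sqrt(5)) / 2

    for k in range(2, 12):
        correction = 1 / (-phi)**k          # the term that dies off
        print(k, phi**k, phi**k + correction)

The last column is an integer (up to floating-point noise), and the correction term is exactly the gap between phi^k and that integer.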


That seems like a clever proof, and I understand it and even admire it. But it doesn't look like the kind of thing that's "obvious" unless you frequently work with this kind of thing.



