Aug 26, 2023 · edited Aug 26, 2023 · Liked by Joel David Hamkins

Is there a proof of the irrationality of sqrt(2) that generalizes further than "integer roots of integers that are not integers are irrational"?

Conway and I gave one in our paper “Extreme Proofs”. Originally due to Laczkovich:

For positive integers n, (sqrt(2) - 1)^n has the form a·sqrt(2) + b, where a and b are integers, not necessarily positive. But if sqrt(2) were rational with denominator D, then for integers a and b, a·sqrt(2) + b would be rational with denominator dividing D, and every nonzero rational with denominator dividing D has absolute value at least 1/D. So a·sqrt(2) + b could not approach a limit of 0, as (sqrt(2) - 1)^n must, since 0 < sqrt(2) - 1 < 1.
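The integer bookkeeping here is easy to check by machine. A short Python sketch (the function name and presentation are mine, not from the paper): multiplying a·sqrt(2) + b by sqrt(2) - 1 gives (b - a)·sqrt(2) + (2a - b), so the pair (a, b) can be tracked in exact integer arithmetic while the value of the power shrinks toward 0.

```python
import math

def sqrt2_minus_1_power(n):
    """Integers (a, b) with (sqrt(2) - 1)**n == a*sqrt(2) + b.

    Uses (a*sqrt(2) + b)*(sqrt(2) - 1) == (b - a)*sqrt(2) + (2*a - b),
    so everything stays in exact integer arithmetic.
    """
    a, b = 0, 1  # n = 0: the value 1 = 0*sqrt(2) + 1
    for _ in range(n):
        a, b = b - a, 2 * a - b
    return a, b

# The powers shrink toward 0 even though a and b grow in size:
for n in (1, 2, 5, 10):
    a, b = sqrt2_minus_1_power(n)
    print(n, a, b, a * math.sqrt(2) + b)
```

For instance n = 5 gives (a, b) = (29, -41), and 29·sqrt(2) - 41 ≈ 0.0122, which already could not be a fraction with any fixed small denominator.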

The proof can be generalized to show that any real algebraic integer (a root of a monic polynomial with integer coefficients, that is, one whose highest-degree coefficient is 1) is either integral or irrational. All you have to note is that higher powers of the root can be replaced recursively by integer combinations of lower powers, because the root satisfies a monic polynomial.
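That recursive power reduction can be sketched in a few lines of Python (my own presentation, not code from the paper): if the root satisfies the monic relation x^d = -(p_0 + p_1·x + ... + p_(d-1)·x^(d-1)), then every power x^n reduces to an integer combination of 1, x, ..., x^(d-1).

```python
def x_power_reduced(n, p):
    """Coefficients (low degree first) of x**n reduced modulo the monic
    integer polynomial with coefficients p (low degree first, leading 1).

    Because the polynomial is monic, replacing x**d by
    -(p[0] + p[1]*x + ... + p[d-1]*x**(d-1)) never leaves the integers.
    """
    d = len(p) - 1
    assert p[-1] == 1, "polynomial must be monic"
    r = [1] + [0] * (d - 1)  # represents the constant 1
    for _ in range(n):
        carry = r[-1]
        r = [0] + r[:-1]       # multiply by x
        for i in range(d):     # reduce the overflowed x**d term
            r[i] -= carry * p[i]
    return r

# Example: for a root t of x**3 - x - 1 (so t**3 = t + 1),
# t**5 reduces to 1 + t + t**2, with integer coefficients throughout.
print(x_power_reduced(5, [-1, -1, 0, 1]))
```

So every power of the root lies in the same fixed set of rationals with bounded denominator, which is what the Laczkovich argument needs.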

It’s also easy to see that any algebraic irrational is an algebraic integer divided by an integer: if the leading term of its polynomial is ax^n, multiply the whole polynomial by a^(n-1) and set y = ax, and you have a monic integer polynomial satisfied by y.
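That substitution is mechanical enough to write out (a sketch; `monicize` is my own name for it): if p(x) = c_n·x^n + ... + c_0, then after multiplying by c_n^(n-1) and setting y = c_n·x, the coefficient of y^i is c_i · c_n^(n-1-i), with leading coefficient 1.

```python
def monicize(c):
    """Given integer coefficients c (low degree first) of a polynomial
    with leading coefficient a = c[-1], return the monic integer
    polynomial satisfied by y = a*x.

    Multiplying p(x) by a**(n-1) and substituting x = y/a turns the
    coefficient of x**i into c[i] * a**(n - 1 - i).
    """
    n = len(c) - 1
    a = c[-1]
    return [c[i] * a ** (n - 1 - i) for i in range(n)] + [1]

# Example: 2*x**2 - 1 has root 1/sqrt(2); its double y = sqrt(2)
# satisfies the monic polynomial y**2 - 2.
print(monicize([-1, 0, 2]))
```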

For any SPECIFIC algebraic irrational, there is a trivially easy proof that it is irrational, using the rational root theorem: for any rational root in lowest terms, the denominator must divide the leading coefficient and the numerator must divide the constant term, so you just check all the possibilities.

(“Easy”, but not necessarily short: you might have difficulty factoring those two coefficients if they are large, and if on the other hand they are easy to factor, they may have lots of divisors, so you will have a lot of checking to do!)
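The whole check can be sketched in Python (the helper names are mine; this assumes a nonzero constant term, so the divisor lists are finite): enumerate the divisors of the constant term and of the leading coefficient, and test every signed candidate exactly with `fractions.Fraction`.

```python
from fractions import Fraction

def divisors(m):
    """Positive divisors of a nonzero integer m (naive trial division)."""
    m = abs(m)
    return [d for d in range(1, m + 1) if m % d == 0]

def rational_roots(c):
    """All rational roots of the integer polynomial with coefficients c
    (low degree first), by the rational root theorem: a root p/q in
    lowest terms must have p | c[0] and q | c[-1]."""
    assert c[0] != 0, "factor out x first if the constant term is zero"
    roots = set()
    for p in divisors(c[0]):
        for q in divisors(c[-1]):
            for s in (1, -1):
                cand = Fraction(s * p, q)
                if sum(ci * cand ** i for i, ci in enumerate(c)) == 0:
                    roots.add(cand)
    return roots

# x**2 - 2 has no rational root, so sqrt(2) is irrational:
print(rational_roots([-2, 0, 1]))   # empty
# whereas 2*x - 1 has the rational root 1/2:
print(rational_roots([-1, 2]))
```

An empty result proves that every real root of the polynomial is irrational, which is exactly the “check all possibilities” argument above.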

## A classical beginning
