There are two ways to calculate the amount of information in one term of a continued fraction:
- The entropy of the Gauss-Kuzmin distribution is about 3.4325 bits.
- Twice the base-2 logarithm of the Khinchin-Lévy constant is about 3.4237 bits.
These differ by about 0.0088 bits. It took me a while to figure out why they were different at all, and now I’m surprised by how close they are.
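Both numbers can be checked numerically. A minimal Python sketch, using the standard formulas — the Gauss-Kuzmin probabilities $P(k) = -\log_2(1 - 1/(k+1)^2)$ and the Khinchin-Lévy constant $e^{\pi^2/(12\ln 2)}$ — with the entropy sum truncated at $10^6$ terms (the truncation error is on the order of $10^{-4}$ bits, small enough for four decimal places):

```python
import math

def gauss_kuzmin(k: int) -> float:
    """Probability that a continued fraction term equals k."""
    return -math.log2(1 - 1 / (k + 1) ** 2)

# Shannon entropy of the Gauss-Kuzmin distribution.
# The summand decays like log(k)/k^2, so truncating at 10^6 terms
# leaves an error on the order of 1e-4 bits.
entropy = 0.0
for k in range(1, 10**6):
    p = gauss_kuzmin(k)
    entropy -= p * math.log2(p)

# Twice the base-2 log of the Khinchin-Lévy constant e^(pi^2 / (12 ln 2))
# simplifies to pi^2 / (6 (ln 2)^2).
two_log2_levy = math.pi**2 / (6 * math.log(2) ** 2)

print(f"entropy:        {entropy:.4f} bits")        # about 3.4325
print(f"2 log2(Levy):   {two_log2_levy:.4f} bits")  # about 3.4237
print(f"difference:     {entropy - two_log2_levy:.4f} bits")
```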
The rest of this post is on LessWrong because it has equations and spoiler tags.