A hundredth of a bit of extra entropy

There are two ways to calculate the average amount of information in one term of a continued fraction:

  • The entropy of the Gauss-Kuzmin distribution is about 3.4325 bits.
  • Twice the base-2 logarithm of the Khinchin-Lévy constant is about 3.4237 bits.

These differ by about 0.0088 bits. It took me a while to figure out why they were different at all, and now I’m surprised by how close they are.
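To make the comparison concrete, here is a small Python sketch (my addition, not from the post) that evaluates both quantities numerically: the entropy of the Gauss-Kuzmin distribution, whose probabilities are P(k) = -log2(1 - 1/(k+1)^2), and twice the base-2 logarithm of the Khinchin-Lévy constant e^(π²/(12 ln 2)). Both formulas are standard; the truncation cutoff in the entropy sum is the only approximation.

    import math

    # Gauss-Kuzmin distribution: the probability that a continued-fraction
    # term equals k is P(k) = -log2(1 - 1/(k+1)^2).
    def gauss_kuzmin(k: int) -> float:
        return -math.log2(1 - 1 / (k + 1) ** 2)

    # Entropy H = -sum P(k) * log2 P(k), truncated at a large cutoff.
    # The summand decays roughly like log(k)/k^2, so a cutoff of 10^6
    # suffices for the first few decimal places.
    entropy = -sum(
        p * math.log2(p)
        for p in (gauss_kuzmin(k) for k in range(1, 10**6))
    )

    # Khinchin-Lévy constant: convergent denominators grow like levy**n,
    # so each extra term pins down about 2 * log2(levy) bits of the number.
    levy = math.exp(math.pi**2 / (12 * math.log(2)))
    levy_bits = 2 * math.log2(levy)

    print(f"Gauss-Kuzmin entropy:     {entropy:.4f} bits")               # ~3.4325
    print(f"2 * log2(Levy constant):  {levy_bits:.4f} bits")             # ~3.4237
    print(f"difference:               {entropy - levy_bits:.4f} bits")   # ~0.0088

Running this reproduces the two values above and their gap of about 0.0088 bits.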

The rest of this post is on LessWrong because it has equations and spoiler tags.
