Hyperbolic Information Geometry
Information geometry gives a way to associate a geometry to a parametrized family of probability distributions. As the name suggests, this notion has roots in information theory and yields surprising insights into many well-known statistical distributions. In this video, I discuss several examples of probability families whose induced geometry turns out to be hyperbolic.

Artwork:
Entropy Increases, Clayton Shonkwiler
https://shonkwiler.org/
A half-plane rendition of Circle Limit IV, Arkadiusz Jadczyk
http://arkadiusz-jadczyk.eu/blog/2017/04/eshers-limit-circle-iv-rendered-complex-upper-half-plane/
Normal distributions, Casey Chu
http://caseychu.github.io/normal-distribution-manifold/

Background on information geometry:
Smith, Lewis. "A gentle introduction to information geometry." https://www.robots.ox.ac.uk/~lsgs/posts/2019-09-27-info-geom.html
Nielsen, Frank. "An elementary introduction to information geometry." Entropy 22, no. 10 (2020): 1100. https://arxiv.org/abs/1808.08271
Amari, Shun-ichi, and Hiroshi Nagaoka. Methods of Information Geometry. Vol. 191. American Mathematical Society, 2000.

References for the main results:
In Information Geometry and Its Applications, Amari remarks that Hotelling seems to have observed in 1929 that any location-scale family has a Fisher metric of constant negative curvature. However, there does not seem to be a contemporary reference. This fact was rediscovered by Amari in 1959, and he is often credited with the result.
Many other exponential families are considered in the following lecture notes:
Amari, S.-I., Barndorff-Nielsen, O. E., & Kass, R. E. (1987). Differential Geometry in Statistical Inference. IMS.
A detailed study of the inverse Gaussian family can be found in the following paper:
Villarroya, A., & Oller, J. M. (1993). Statistical tests for the inverse Gaussian distribution based on Rao distance. Sankhyā: The Indian Journal of Statistics, Series A, 80-103.
Khan, G., & Zhang, J. (2022). A hall of statistical mirrors. Asian Journal of Mathematics, Vol. 26, No. 6, pp. 809-846.

Chapters
0:00 Introduction
1:22 What is information geometry?
2:39 Some initial counterexamples and background
5:15 Normal distributions and the Fisher metric
7:37 Negative trinomial distributions
10:41 A diversion on statistical mirror symmetry
12:28 Inverse Gaussian distributions
14:48 Isometries of the inverse Gaussian family
16:06 Conclusion and a slower derivation of the Fisher metric

Technical notes:
At 4:14, the paper showing that there are infinitely many Hessian structures on hyperbolic space is:
Kito, H. (1999). On Hessian structures on the Euclidean space and the hyperbolic space.
Any Hessian manifold can be realized as a statistical manifold, but this does not directly show that there are infinitely many exponential families with hyperbolic geometry.
At 4:45, the question of determining when two Riemannian manifolds are locally isometric is known as the "equivalence problem," which can be quite difficult to solve in general. There are algorithms that apply to general manifolds, but by focusing on surfaces of constant curvature, we were able to avoid this technical difficulty entirely.
For background on Riemannian geometry and how the metric determines the geometry, I highly recommend John Lee's "Introduction to Riemannian Manifolds." It might be a good idea to take a look at his "Introduction to Smooth Manifolds" first, though.

The intro music was made by Kevin Rosales. The outro music was written by me; you can find the sheet music here:
https://differentialgeometri.files.wordpress.com/2021/07/fugue-excerpt-july-25.pdf

#InformationGeometry #mathematics
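The claim above for the normal family (see 5:15 in the chapters) can be checked symbolically. Below is a minimal sketch, not from the video, using SymPy to compute the Fisher information matrix of the family N(mu, sigma^2) directly from the log-likelihood:

```python
import sympy as sp

x = sp.Symbol('x', real=True)
mu = sp.Symbol('mu', real=True)
sigma = sp.Symbol('sigma', positive=True)

# Density of the normal family N(mu, sigma^2)
p = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
loglik = sp.log(p)

params = (mu, sigma)
# Fisher metric: g_ij = E[(d_i log p)(d_j log p)], expectation under p
G = sp.Matrix(2, 2, lambda i, j: sp.simplify(sp.integrate(
    sp.diff(loglik, params[i]) * sp.diff(loglik, params[j]) * p,
    (x, -sp.oo, sp.oo))))

print(G)  # expect diag(1/sigma**2, 2/sigma**2)
```

The resulting metric is ds^2 = (dmu^2 + 2 dsigma^2)/sigma^2; substituting tau = sqrt(2)*sigma turns it into 2(dmu^2 + dtau^2)/tau^2, a rescaled upper half-plane metric, so the curvature is constant and negative (here -1/2), matching the Hotelling/Amari observation.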