Information geometry in optimization, machine learning and statistical inference

Shun-ichi AMARI
Front. Electr. Electron. Eng., 2010, Vol. 5, Issue 3: 241-260. DOI: 10.1007/s11460-010-0101-3
Research articles
Abstract

The present article gives an introduction to information geometry and surveys its applications in machine learning, optimization and statistical inference. Information geometry is explained intuitively by means of divergence functions introduced on a manifold of probability distributions and on other, more general manifolds. A divergence function induces a Riemannian structure together with a pair of dually coupled affine connections. Many manifolds of practical interest are dually flat, and on a dually flat manifold a generalized Pythagorean theorem and a related projection theorem hold. These theorems provide useful tools for a variety of approximation and optimization problems. We apply them to alternating minimization problems, Ying-Yang machines and the belief propagation algorithm in machine learning.
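The generalized Pythagorean theorem mentioned in the abstract can be checked numerically in a classical special case: the manifold of independent (product) distributions is e-flat, the m-projection of a joint distribution onto it is the product of its marginals, and the Kullback-Leibler divergence then decomposes exactly. The sketch below is illustrative only (the specific joint distribution and the point `q` are arbitrary choices, not from the paper):

```python
import numpy as np

def kl(a, b):
    """Kullback-Leibler divergence D(a || b) for strictly positive discrete distributions."""
    return float(np.sum(a * np.log(a / b)))

# An arbitrary joint distribution p(x, y) on a 2x2 sample space (illustrative values)
p = np.array([[0.30, 0.20],
              [0.10, 0.40]])

# m-projection of p onto the e-flat submanifold of independent distributions:
# the product of its marginals.
px = p.sum(axis=1)          # marginal distribution of x
py = p.sum(axis=0)          # marginal distribution of y
p_hat = np.outer(px, py)    # projected (independent) distribution

# Any other point q on the same independence submanifold
q = np.outer([0.6, 0.4], [0.5, 0.5])

# Generalized Pythagorean theorem: D(p||q) = D(p||p_hat) + D(p_hat||q)
lhs = kl(p, q)
rhs = kl(p, p_hat) + kl(p_hat, q)
print(np.isclose(lhs, rhs))  # the decomposition holds exactly
```

The decomposition is exact for every independent `q`, which is what makes the m-projection onto an e-flat submanifold behave like an orthogonal projection in Euclidean geometry.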

Keywords

information geometry / machine learning / optimization / statistical inference / divergence / graphical model / Ying-Yang machine

Cite this article

Shun-ichi AMARI. Information geometry in optimization, machine learning and statistical inference. Front. Electr. Electron. Eng., 2010, 5(3): 241-260. https://doi.org/10.1007/s11460-010-0101-3