
Information geometry in optimization, machine learning and statistical inference

  • RIKEN Brain Science Institute, Saitama 351-0198, Japan

Publication date: 05 Sep 2010

Abstract

The present article gives an introduction to information geometry and surveys its applications in the areas of machine learning, optimization, and statistical inference. Information geometry is explained intuitively by using divergence functions introduced on a manifold of probability distributions and on other general manifolds. These divergences give a Riemannian structure together with a pair of dual flatness criteria. Many manifolds are dually flat. When a manifold is dually flat, a generalized Pythagorean theorem and a related projection theorem are introduced. They provide useful means for various approximation and optimization problems. We apply them to alternating minimization problems, Ying-Yang machines, and the belief propagation algorithm in machine learning.
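
As a brief illustration of the generalized Pythagorean theorem mentioned above, here is its standard instance for the Kullback-Leibler divergence (a sketch added for orientation, not quoted from the article, which states the theorem for the canonical divergence of a general dually flat manifold). With the divergence defined as

\[
  D_{\mathrm{KL}}(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)},
\]

if the e-geodesic (one-parameter exponential family) joining p and q meets the m-geodesic (mixture family) joining q and r orthogonally at q, then the divergence decomposes additively:

\[
  D_{\mathrm{KL}}(p \,\|\, r) = D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, r).
\]

The related projection theorem characterizes the divergence-minimizing point of a flat submanifold as the foot of a geodesic projection orthogonal to that submanifold, which is what makes these results useful for the approximation and optimization problems the abstract refers to.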

Cite this article

Shun-ichi AMARI. Information geometry in optimization, machine learning and statistical inference[J]. Frontiers of Electrical and Electronic Engineering, 2010, 5(3): 241-260. DOI: 10.1007/s11460-010-0101-3
