Information geometry in optimization, machine learning and statistical inference

Front. Electr. Electron. Eng., 2010, Vol. 5, Issue (3): 241-260. DOI: 10.1007/s11460-010-0101-3

Research articles



Abstract

The present article gives an introduction to information geometry and surveys its applications in the areas of machine learning, optimization and statistical inference. Information geometry is explained intuitively through divergence functions defined on a manifold of probability distributions and on other general manifolds. A divergence function induces a Riemannian structure together with a pair of dual affine connections, and many important manifolds turn out to be dually flat. When a manifold is dually flat, a generalized Pythagorean theorem and a related projection theorem hold; these provide useful tools for a variety of approximation and optimization problems. We apply them to alternating minimization problems, Ying-Yang machines and the belief propagation algorithm in machine learning.
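The generalized Pythagorean theorem mentioned in the abstract can be illustrated numerically in a simple dually flat setting. The sketch below (all distributions are hypothetical examples, not taken from the article) uses the Kullback-Leibler divergence: the m-projection of a joint distribution onto the e-flat family of product distributions is the product of its marginals, and the divergence then decomposes exactly as D(p‖r) = D(p‖q) + D(q‖r) for any other product distribution r.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical 2x2 joint distribution p(x, y).
p = np.array([[0.30, 0.20],
              [0.10, 0.40]])

# m-projection of p onto the e-flat submanifold of product
# distributions: the product of its marginals.
px, py = p.sum(axis=1), p.sum(axis=0)
q = np.outer(px, py)

# Any other point r in that submanifold (another product distribution).
rx, ry = np.array([0.6, 0.4]), np.array([0.25, 0.75])
r = np.outer(rx, ry)

# Generalized Pythagorean theorem: D(p||r) = D(p||q) + D(q||r).
lhs = kl(p.ravel(), r.ravel())
rhs = kl(p.ravel(), q.ravel()) + kl(q.ravel(), r.ravel())
assert np.isclose(lhs, rhs)
```

The decomposition holds exactly here because the product distributions form an exponential (e-flat) family and q is the m-projection of p onto it; for a general point q of the submanifold the identity fails.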

Keywords

information geometry / machine learning / optimization / statistical inference / divergence / graphical model / Ying-Yang machine

Cite this article

Amari S. Information geometry in optimization, machine learning and statistical inference. Front. Electr. Electron. Eng., 2010, 5(3): 241-260. DOI: 10.1007/s11460-010-0101-3
