Information Divergence and the Generalized Normal Distribution: A Study on Symmetricity
Thomas L. Toulias, Christos P. Kitsos
Communications in Mathematics and Statistics, 2021, Vol. 9, Issue 4: 439–465.
This paper investigates and discusses the use of information divergence, through the widely used Kullback–Leibler (KL) divergence, under the multivariate (generalized) $\gamma $-order normal distribution ($\gamma $-GND). The behavior of the KL divergence, as far as its symmetricity is concerned, is studied by calculating the divergence of the $\gamma $-GND over the multivariate Student's t-distribution and vice versa. Certain special cases are also given and discussed. Furthermore, three symmetrized forms of the KL divergence, namely the Jeffreys distance and the geometric-KL and harmonic-KL distances, are computed between two members of the $\gamma $-GND family, while the corresponding differences between these information distances are also discussed.
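For orientation, the three symmetrized forms mentioned in the abstract are commonly built from the sum, the geometric mean, and the harmonic mean of the two directed KL divergences. The following is a minimal sketch using standard conventions; the paper's exact normalizations may differ:

$$
\begin{aligned}
\mathrm{KL}(P\,\|\,Q) &= \int p(x)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x,\\
J(P,Q) &= \mathrm{KL}(P\,\|\,Q)+\mathrm{KL}(Q\,\|\,P) &&\text{(Jeffreys distance)},\\
G(P,Q) &= \sqrt{\mathrm{KL}(P\,\|\,Q)\,\mathrm{KL}(Q\,\|\,P)} &&\text{(geometric-KL distance)},\\
H(P,Q) &= \frac{2\,\mathrm{KL}(P\,\|\,Q)\,\mathrm{KL}(Q\,\|\,P)}{\mathrm{KL}(P\,\|\,Q)+\mathrm{KL}(Q\,\|\,P)} &&\text{(harmonic-KL distance)}.
\end{aligned}
$$

Under the usual convention, the resistor-average distance of Johnson and Sinanovic, listed among the keywords, is half the harmonic form, i.e., $R(P,Q)^{-1}=\mathrm{KL}(P\,\|\,Q)^{-1}+\mathrm{KL}(Q\,\|\,P)^{-1}$.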
Kullback–Leibler divergence / Jeffreys distance / Resistor-average distance / Multivariate $\gamma $-order normal distribution / Multivariate Student's t-distribution / Multivariate Laplace distribution