Jensen–Renyi’s–Tsallis Fuzzy Divergence Information Measure with its Applications
Ratika Kadian, Satish Kumar
Communications in Mathematics and Statistics ›› 2022, Vol. 10 ›› Issue (3): 451–482.
In this paper, we characterize the sum of two general measures associated with two discrete probability distributions. One of these measures is logarithmic, while the other contains a power of the variables; together they form the joint representation of the Renyi's–Tsallis divergence measure. We then propose a divergence measure based on Jensen–Renyi's–Tsallis entropy, called the Jensen–Renyi's–Tsallis divergence measure, which generalizes the J-divergence information measure. One of its salient features is that equal weight can be assigned to each probability distribution, which makes it particularly suitable for decision problems in which the weights may be the prior probabilities. The idea is further generalized from a probabilistic to a fuzzy similarity/dissimilarity measure. Besides validating the proposed measure, we study some of its key properties and contrast its performance with that of several existing measures. Finally, illustrative examples in clustering analysis, financial diagnosis and pattern recognition demonstrate the practicality and adequacy of the proposed measure between two fuzzy sets (FSs).
Fuzzy sets / Divergence measure / Similarity measure / Clustering analysis / Financial diagnosis
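The abstract's full Jensen–Renyi's–Tsallis measure is not reproduced on this page, but its Jensen-type construction (entropy of the equal-weight mixture minus the average of the individual entropies) can be illustrated with the classical Jensen–Rényi divergence for discrete distributions. The function names and the choice of order `alpha` below are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha (alpha > 0, alpha != 1):
    # H_alpha(P) = log(sum_i p_i^alpha) / (1 - alpha)
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def jensen_renyi_divergence(p, q, alpha=0.5):
    # Equal-weight Jensen-type divergence: entropy of the 50/50 mixture
    # minus the average of the two individual entropies. For alpha in
    # (0, 1), Rényi entropy is concave, so this quantity is nonnegative
    # and vanishes when P = Q.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return renyi_entropy(m, alpha) - 0.5 * (
        renyi_entropy(p, alpha) + renyi_entropy(q, alpha)
    )
```

For example, `jensen_renyi_divergence([0.9, 0.1], [0.1, 0.9])` is strictly positive, while identical distributions give zero, reflecting the identity-of-indiscernibles property a divergence measure is expected to satisfy.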