Rotation forest based on multimodal genetic algorithm

Zhe Xu, Wei-chen Ni, Yue-hui Ji

Journal of Central South University ›› 2021, Vol. 28 ›› Issue (6): 1747-1764. DOI: 10.1007/s11771-021-4730-x

Abstract

In machine learning, randomness is a crucial factor in the success of ensemble learning, and it can be injected into tree-based ensembles by rotating the feature space. In common practice, however, the feature space is rotated at random, so a large number of trees are needed to guarantee the performance of the ensemble model. This random rotation method is theoretically feasible, but it requires massive computing resources, which can restrict its applications. To solve this problem, a multimodal genetic algorithm based rotation forest (MGARF) algorithm is proposed in this paper. It is a tree-based ensemble learning algorithm for classification that, like its predecessors, injects randomness through feature rotation, but it additionally selects a subset of more diverse and accurate base learners using a multimodal optimization method. The classification accuracy of the proposed MGARF algorithm was evaluated against the original random forest and random rotation ensemble methods on 23 UCI classification datasets. Experimental results show that MGARF outperforms the other methods while requiring far fewer base learners.
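To make the rotation mechanism concrete, here is a minimal sketch (not the authors' implementation) of a random rotation ensemble in Python: each decision tree is trained on a copy of the data whose feature space has been multiplied by a random rotation matrix, and predictions are combined by majority vote. The class name `RandomRotationForest` and all parameters are hypothetical, and integer class labels are assumed.

```python
import numpy as np
from scipy.stats import ortho_group              # samples random orthogonal matrices
from sklearn.tree import DecisionTreeClassifier

class RandomRotationForest:
    """Illustrative random rotation ensemble: one random rotation per tree."""

    def __init__(self, n_trees=100, random_state=0):
        self.n_trees = n_trees
        self.rng = np.random.RandomState(random_state)
        self.rotations, self.trees = [], []

    def fit(self, X, y):
        d = X.shape[1]
        for _ in range(self.n_trees):
            R = ortho_group.rvs(dim=d, random_state=self.rng)
            if np.linalg.det(R) < 0:
                R[:, 0] = -R[:, 0]               # force det(R)=+1: a proper rotation
            tree = DecisionTreeClassifier(random_state=self.rng)
            tree.fit(X @ R, y)                   # train on the rotated feature space
            self.rotations.append(R)
            self.trees.append(tree)
        return self

    def predict(self, X):
        # Majority vote over the rotated base learners (integer labels assumed).
        votes = np.stack([t.predict(X @ R).astype(int)
                          for R, t in zip(self.rotations, self.trees)])
        return np.array([np.bincount(col).argmax() for col in votes.T])
```

Because the rotations are drawn blindly, many trees must be grown before enough useful orientations appear, which is exactly the resource problem the abstract raises; the selection step sketched after the citation block is the proposed remedy.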

Keywords

ensemble learning / decision tree / multimodal optimization / genetic algorithm

Cite this article

Zhe Xu, Wei-chen Ni, Yue-hui Ji. Rotation forest based on multimodal genetic algorithm. Journal of Central South University, 2021, 28(6): 1747-1764. DOI: 10.1007/s11771-021-4730-x
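The abstract's second ingredient, selecting a diverse and accurate subset of base learners, can also be sketched. The code below is hypothetical rather than the paper's actual MGARF procedure: it uses a plain genetic algorithm over 0/1 masks whose fitness combines validation accuracy with mean pairwise disagreement (a simple diversity measure). The multimodal (niching) machinery that MGARF uses to maintain several distinct optima is not reproduced here. `preds` holds each base learner's integer-label predictions on a validation set; all function and parameter names are invented for illustration.

```python
import numpy as np

def subset_fitness(mask, preds, y_val, alpha=0.5):
    """Score a 0/1 mask over base learners: weighted sum of majority-vote
    accuracy on a validation set and mean pairwise disagreement."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return -np.inf                            # empty ensembles are invalid
    sub = preds[idx]                              # shape (k, n_val)
    vote = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, sub)
    acc = np.mean(vote == y_val)
    if idx.size > 1:
        dis = np.mean([np.mean(sub[i] != sub[j])  # fraction of disagreements
                       for i in range(idx.size)
                       for j in range(i + 1, idx.size)])
    else:
        dis = 0.0
    return alpha * acc + (1 - alpha) * dis

def ga_select(preds, y_val, pop=30, gens=50, p_mut=0.05, seed=0):
    """Plain GA over 0/1 masks; returns the best mask found."""
    rng = np.random.default_rng(seed)
    m = preds.shape[0]
    population = rng.integers(0, 2, size=(pop, m))
    for _ in range(gens):
        scores = np.array([subset_fitness(ind, preds, y_val) for ind in population])
        # Binary tournament selection.
        winners = [max(rng.choice(pop, 2, replace=False), key=lambda i: scores[i])
                   for _ in range(pop)]
        parents = population[winners]
        # One-point crossover on consecutive pairs.
        children = parents.copy()
        for i in range(0, pop - 1, 2):
            cut = rng.integers(1, m)
            children[i, cut:] = parents[i + 1, cut:]
            children[i + 1, cut:] = parents[i, cut:]
        # Bit-flip mutation.
        flip = rng.random(children.shape) < p_mut
        children[flip] ^= 1
        population = children
    scores = np.array([subset_fitness(ind, preds, y_val) for ind in population])
    return population[scores.argmax()]
```

A niching GA would partition the population so that several high-fitness masks survive as separate peaks instead of one winner; this sketch only conveys the accuracy-plus-diversity objective that drives the selection.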


