An Ensemble Multi-Model Voting Method for Adapting to Concept Drift

Authors

  • Min Wang

DOI:

https://doi.org/10.54097/6m7rrt90

Keywords:

Streaming Data Mining, Ensemble Methods, Voting Mechanisms, Concept Drift

Abstract

To address the challenges posed by concept drift in streaming data mining, this paper proposes an Ensemble Multi-Model Voting Method for Adapting to Concept Drift (EMVM_ATCD). The method combines multiple classifiers into an ensemble to improve model stability, updates the model with online learning, and adds a dropout layer that forces the model to learn different feature combinations, enhancing its generalization ability. A voting mechanism aggregates the individual models' predictions, strengthening the method's ability to cope with concept drift. Experimental results show that the method achieves performance improvements ranging from 0.1% to 10% on multiple datasets, demonstrating that it can effectively handle various types of data.
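
The abstract's three ingredients (online model updates, dropout-style randomization, and majority voting over an ensemble) can be illustrated with a minimal sketch. This is not the authors' implementation: all class names and parameters below are hypothetical, the base learners are simple online perceptrons rather than the paper's models, and the dropout layer is approximated by randomly skipping ensemble members during each update.

```python
import random

class OnlinePerceptron:
    """A base learner updated one sample at a time (online learning)."""
    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        s = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1 if s >= 0 else 0

    def update(self, x, y):
        err = y - self.predict(x)  # in {-1, 0, 1}
        if err:
            for i, xi in enumerate(x):
                self.w[i] += self.lr * err * xi
            self.b += self.lr * err

class VotingEnsemble:
    """Majority vote over members; each update randomly 'drops' some members."""
    def __init__(self, n_models, n_features, drop_rate=0.3, seed=0):
        self.models = [OnlinePerceptron(n_features) for _ in range(n_models)]
        self.drop_rate = drop_rate
        self.rng = random.Random(seed)

    def predict(self, x):
        votes = sum(m.predict(x) for m in self.models)
        return 1 if votes * 2 >= len(self.models) else 0

    def update(self, x, y):
        for m in self.models:
            if self.rng.random() >= self.drop_rate:  # dropout-style skip
                m.update(x, y)

# Simulated stream with an abrupt concept drift halfway through:
# the decision rule on x[0] flips at t = 500.
rng = random.Random(42)
ensemble = VotingEnsemble(n_models=7, n_features=2)
correct = total = 0
for t in range(1000):
    x = [rng.random(), rng.random()]
    label_one = x[0] > 0.5
    if t >= 500:
        label_one = not label_one  # concept drift
    y = 1 if label_one else 0
    if t >= 800:  # prequential evaluation on the post-drift window
        total += 1
        correct += (ensemble.predict(x) == y)
    ensemble.update(x, y)
post_drift_accuracy = correct / total
```

Because every member keeps learning online, the ensemble recovers from the flipped concept within a few hundred samples, which is the behavior the paper's voting mechanism is designed to stabilize.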

Published

29-04-2025

Section

Articles

How to Cite

Wang, M. (2025). An Ensemble Multi-Model Voting Method for Adapting to Concept Drift. Frontiers in Computing and Intelligent Systems, 12(1), 210-215. https://doi.org/10.54097/6m7rrt90