Pruning boosted classifiers with a real valued genetic algorithm

Authors:

Highlights:

Abstract

Ensemble classifiers and algorithms for learning ensembles have recently received a great deal of attention in the machine learning literature (R.E. Schapire, Machine Learning 5(2) (1990) 197–227; N. Cesa-Bianchi, Y. Freund, D. Haussler, D.P. Helmbold, R.E. Schapire, M.K. Warmuth, Proceedings of the 25th Annual ACM Symposium on the Theory of Computing, 1993, pp. 382–391; L. Breiman, Bias, Technical Report 460, Statistics Department, University of California, Berkeley, CA, 1996; J.R. Quinlan, Proceedings of the 14th International Conference on Machine Learning, 1997; Y. Freund, R.E. Schapire, Proceedings of the 13th International Conference on Machine Learning ICML96, Bari, Italy, 1996, pp. 148–157; A.J.C. Sharkey, N.E. Sharkey, Combining diverse neural nets, The Knowledge Engineering Review 12(3) (1997) 231–247). In particular, boosting has attracted interest as a mechanism for discovering an ensemble of classifiers that generalises better than any single classifier derived using a particular technique. In this article, we examine and compare a number of techniques for pruning a classifier ensemble that is overfit to its training set, and find that a real-valued GA is at least as good as the best heuristic search algorithm for choosing an ensemble weighting.
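The abstract does not spell out the pruning procedure, but the general idea of a real-valued GA over ensemble weightings can be sketched as follows: each chromosome is a vector of non-negative weights, one per boosted member; fitness is the accuracy of the weighted vote on a held-out set; and members whose evolved weight is (near) zero are pruned. The sketch below is an illustrative assumption, not the authors' exact algorithm, and all function names and hyper-parameters are hypothetical.

```python
# Minimal real-valued GA for pruning/reweighting an ensemble (illustrative sketch).
import numpy as np

def weighted_vote_accuracy(weights, member_preds, y_true):
    """Fitness: accuracy of a weighted vote over binary labels in {-1, +1}.

    member_preds: array of shape (n_members, n_samples), one row per member's predictions.
    """
    scores = weights @ member_preds          # weighted sum of member votes
    y_hat = np.where(scores >= 0.0, 1, -1)
    return float(np.mean(y_hat == y_true))

def prune_with_real_ga(member_preds, y_true, pop_size=50, generations=100,
                       mutation_sigma=0.1, elite=2, seed=None):
    rng = np.random.default_rng(seed)
    n_members = member_preds.shape[0]
    pop = rng.random((pop_size, n_members))  # initial non-negative weight vectors
    for _ in range(generations):
        fitness = np.array([weighted_vote_accuracy(w, member_preds, y_true) for w in pop])
        pop = pop[np.argsort(fitness)[::-1]]               # sort best-first
        next_pop = [pop[i].copy() for i in range(elite)]   # elitism
        while len(next_pop) < pop_size:
            # Two-way tournament selection (lower index = fitter, since pop is sorted).
            p1 = pop[min(rng.integers(0, pop_size, size=2))]
            p2 = pop[min(rng.integers(0, pop_size, size=2))]
            # Arithmetic (blend) crossover on real-valued genes.
            alpha = rng.random(n_members)
            child = alpha * p1 + (1.0 - alpha) * p2
            # Gaussian mutation, clipped so weights stay non-negative.
            child += rng.normal(0.0, mutation_sigma, size=n_members)
            np.clip(child, 0.0, None, out=child)
            next_pop.append(child)
        pop = np.array(next_pop)
    fitness = np.array([weighted_vote_accuracy(w, member_preds, y_true) for w in pop])
    return pop[np.argmax(fitness)]
```

In a setup like this, members whose final weight falls below a small threshold can be dropped to obtain the pruned ensemble, and the fitness set should be held out from the data used to fit the boosted members, since the motivation here is precisely that the full ensemble overfits its training set.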

Keywords: Ensemble classifier, Pruning boosted classifiers, Pruning algorithm, Backfitting procedure

Article history: Received 12 February 1999, Accepted 17 March 1999, Available online 23 August 1999.

DOI: https://doi.org/10.1016/S0950-7051(99)00023-4