Evolving data-adaptive support vector machines for binary classification
Authors:
Highlights:
Abstract
Support vector machines (SVMs) have been exploited in a plethora of real-life classification and regression tasks, and are among the most researched supervised learners. However, their generalization abilities strongly depend on the pivotal hyperparameters of the classifier, alongside its training dataset. Moreover, the training process is computationally and memory expensive, hence learning multiple SVMs to grid-search the hyperparameter space is infeasible in practice. In this paper, we address the problem of optimizing SVMs for binary classification of difficult datasets, including very large and extremely imbalanced cases. We propose an evolutionary technique that simultaneously optimizes the critical SVM aspects, including the training sample, kernel functions, and features. Additionally, we introduce a co-evolutionary scheme that competitively guides the search toward the highest-quality solutions. Our extensive experimental study, performed over more than 120 benchmarks, showed that the proposed algorithm outperforms popular supervised learners, as well as other techniques for optimizing SVMs reported in the literature.
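The abstract does not specify the encoding or operators of the evolutionary technique. As a rough, hypothetical sketch of the general idea (jointly evolving a training-sample subset, a feature mask, and the kernel of a binary SVM, scored on a validation split), one could write something like the following in Python with scikit-learn; all names, population sizes, and mutation operators below are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: a tiny evolutionary loop that jointly evolves
# (i) a training-sample subset, (ii) a feature mask, and (iii) the kernel/C
# of a binary SVM, evaluating candidates on a held-out validation split.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
# Imbalanced synthetic benchmark (stand-in for the paper's datasets).
X, y = make_classification(n_samples=600, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, stratify=y,
                                            test_size=0.3, random_state=0)

KERNELS = ["linear", "rbf", "poly"]
SAMPLE_SIZE = 120  # size of the evolved training subset (assumed value)

def random_individual():
    return {
        "samples": rng.choice(len(X_tr), size=SAMPLE_SIZE, replace=False),
        "features": rng.random(X.shape[1]) < 0.5,  # boolean feature mask
        "kernel": rng.choice(KERNELS),
        "C": 10 ** rng.uniform(-2, 2),
    }

def fitness(ind):
    feats = ind["features"]
    if not feats.any():
        return 0.0
    clf = SVC(kernel=ind["kernel"], C=ind["C"], class_weight="balanced")
    clf.fit(X_tr[np.ix_(ind["samples"], feats)], y_tr[ind["samples"]])
    # Balanced accuracy handles the class imbalance in the validation set.
    return balanced_accuracy_score(y_val, clf.predict(X_val[:, feats]))

def mutate(ind):
    child = {k: (v.copy() if isinstance(v, np.ndarray) else v)
             for k, v in ind.items()}
    child["samples"][rng.integers(SAMPLE_SIZE)] = rng.integers(len(X_tr))
    flip = rng.integers(X.shape[1])
    child["features"][flip] = ~child["features"][flip]
    if rng.random() < 0.2:
        child["kernel"] = rng.choice(KERNELS)
    child["C"] *= 10 ** rng.normal(0, 0.2)
    return child

# Simple (mu + lambda)-style loop with truncation selection.
population = [random_individual() for _ in range(20)]
for generation in range(30):
    parents = sorted(population, key=fitness, reverse=True)[:5]
    population = parents + [mutate(parents[rng.integers(len(parents))])
                            for _ in range(15)]

best = max(population, key=fitness)
print("best kernel:", best["kernel"],
      "validation balanced accuracy:", round(fitness(best), 3))
```

This sketch evolves only one population; the co-evolutionary, competitive scheme described in the abstract would involve additional interacting populations, which are not reproduced here.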
Keywords: Support vector machine, Evolutionary algorithm, Training set selection, Model optimization, Kernel function
Article history: Received 9 September 2020, Revised 23 April 2021, Accepted 9 June 2021, Available online 15 June 2021, Version of Record 18 June 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107221