Feature subset selection in large dimensionality domains
Authors:
Abstract
Searching for an optimal feature subset in a high dimensional feature space is known to be an NP-complete problem. We present a hybrid algorithm, SAGA, for this task. SAGA combines the ability of simulated annealing to avoid becoming trapped in local minima, the high convergence rate of the genetic algorithm crossover operator, the strong local search ability of greedy algorithms, and the high computational efficiency of generalized regression neural networks. We compare the performance over time of SAGA and well-known algorithms on synthetic and real datasets. The results show that SAGA outperforms existing algorithms.
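As a rough illustration of the kind of hybrid the abstract describes, the sketch below combines simulated-annealing acceptance, a GA-style uniform crossover, and a minimal GRNN-style leave-one-out evaluator for scoring candidate feature subsets. It is not the authors' SAGA implementation: all function names, parameters, and the synthetic data are assumptions, and the greedy local-search component of the full algorithm is omitted for brevity.

```python
# Illustrative sketch only: SA acceptance + GA crossover for feature subset
# selection, with a minimal GRNN-like kernel regressor as the fitness function.
import numpy as np

rng = np.random.default_rng(0)

def grnn_loss(X, y, mask, sigma=0.5):
    """Leave-one-out squared error of a minimal GRNN on the selected features."""
    if not mask.any():
        return np.inf
    Xs = X[:, mask]
    d2 = ((Xs[:, None, :] - Xs[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(w, 0.0)              # leave-one-out: ignore the point itself
    pred = (w @ y) / (w.sum(1) + 1e-12)
    return ((pred - y) ** 2).mean()

def crossover(a, b):
    """Uniform crossover between two boolean feature masks."""
    pick = rng.random(a.size) < 0.5
    return np.where(pick, a, b)

def hybrid_search(X, y, iters=300, t0=1.0, cooling=0.99):
    n_feat = X.shape[1]
    pop = [rng.random(n_feat) < 0.3 for _ in range(8)]     # small random population
    best = min(pop, key=lambda m: grnn_loss(X, y, m))
    best_loss, t = grnn_loss(X, y, best), t0
    for _ in range(iters):
        p1, p2 = rng.choice(len(pop), 2, replace=False)
        child = crossover(pop[p1], pop[p2])
        flip = rng.integers(n_feat)                        # SA-style bit-flip mutation
        child[flip] = ~child[flip]
        cur = grnn_loss(X, y, pop[p1])
        loss = grnn_loss(X, y, child)
        # Metropolis acceptance: always take improvements, sometimes take worse moves
        if loss < cur or rng.random() < np.exp((cur - loss) / t):
            pop[p1] = child
            if loss < best_loss:
                best, best_loss = child.copy(), loss
        t *= cooling                                       # cool the temperature
    return best, best_loss

# Toy data: only the first 3 of 15 features carry signal.
X = rng.normal(size=(120, 15))
y = X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + 0.1 * rng.normal(size=120)
mask, loss = hybrid_search(X, y)
print("selected features:", np.flatnonzero(mask), "loss:", round(loss, 4))
```

The stochastic acceptance rule lets the search escape local minima early on, while the cooling schedule makes it increasingly greedy as iterations proceed; the crossover step recombines promising subsets from the small population.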
Keywords: Curse of dimensionality, Feature subset selection, High dimensionality, Dimensionality reduction, ACO (ant colony optimization), GA (genetic algorithm), GRNN (generalized regression neural networks), PSO (particle swarm optimization), SA (simulated annealing), SBS (sequential backward selection), SFBS (sequential floating backward selection), SFFS (sequential floating forward selection), SFS (sequential forward selection)
Article history: Received 30 January 2009, Revised 31 May 2009, Accepted 17 June 2009, Available online 24 June 2009.
Article URL: https://doi.org/10.1016/j.patcog.2009.06.009