Robust feature selection based on regularized BrownBoost loss

Authors:

Abstract

Feature selection is an important preprocessing step in machine learning and pattern recognition, and it is also a data mining task in some real-world applications. Feature quality evaluation is a key issue when designing a feature selection algorithm, and in recent years the classification margin has been widely used for this purpose. In this study, we introduce a robust loss function, called the BrownBoost loss, to evaluate feature quality and select optimal feature subsets, thereby enhancing robustness. We compute the classification loss in the feature space using the hypothesis margin and minimize this loss by optimizing the feature weights. An algorithm is developed based on gradient descent with L2-norm regularization. The proposed algorithm is tested on UCI datasets and gene expression datasets. The experimental results show that the proposed algorithm is effective in improving classification robustness.
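The abstract describes a three-part recipe: hypothesis margins computed in the weighted feature space, a bounded BrownBoost-style loss applied to those margins, and gradient descent on the feature weights with an L2 penalty. The sketch below is not the authors' implementation; it is a minimal illustration under simplifying assumptions: Relief-style margins with nearest hit/miss found once in the unweighted space (so the margin is linear in the weights), a bounded loss of the form 1 − erf(m/η) as a stand-in for the BrownBoost loss, and plain gradient descent. All function names, the toy data, and the hyperparameters (η, λ, learning rate) are illustrative choices, not values from the paper.

```python
import numpy as np

def relief_margin_terms(X, y):
    """Per-sample, per-feature hypothesis-margin terms
    (|x - nearmiss| - |x - nearhit|) / 2, with neighbors found once in the
    unweighted space (a Relief-style simplification)."""
    n = len(X)
    M = np.zeros_like(X, dtype=float)
    for i in range(n):
        d = np.abs(X - X[i]).sum(axis=1)   # L1 distances to all samples
        d[i] = np.inf                       # exclude the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, d, np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, d, np.inf))  # nearest other-class
        M[i] = 0.5 * (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit]))
    return M

def fit_weights(M, eta=1.0, lam=0.1, lr=0.1, iters=200):
    """Minimize sum_i [1 - erf(m_i / eta)] + lam * ||w||^2 by gradient
    descent, where m_i = M[i] @ w is linear in w. The erf-based loss is
    bounded, so outliers with large negative margins contribute a
    near-constant loss (the robustness property the abstract refers to)."""
    w = np.ones(M.shape[1])
    for _ in range(iters):
        m = M @ w
        # d/dm [1 - erf(m/eta)] = -(2 / (eta * sqrt(pi))) * exp(-(m/eta)^2)
        g = -(2.0 / (eta * np.sqrt(np.pi))) * np.exp(-(m / eta) ** 2)
        grad = M.T @ g + 2.0 * lam * w
        w -= lr * grad
        w = np.maximum(w, 0.0)  # keep weights nonnegative for interpretability
    return w

# Toy data (illustrative): feature 0 separates the classes, feature 1 is noise.
rng = np.random.default_rng(0)
informative = rng.normal(0.0, 0.3, (20, 1)) + np.repeat([[0.0], [2.0]], 10, axis=0)
noise = rng.normal(0.0, 1.0, (20, 1))
X = np.hstack([informative, noise])
y = np.array([0] * 10 + [1] * 10)

w = fit_weights(relief_margin_terms(X, y))
```

On this toy problem the learned weight for the informative feature dominates the weight for the noise feature; ranking features by `w` (or thresholding it) then yields the selected subset.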

Keywords: Feature selection, Margin, Robustness, BrownBoost loss, Regularization

Article history: Received 10 September 2012, Revised 22 July 2013, Accepted 5 September 2013, Available online 24 September 2013.

DOI: https://doi.org/10.1016/j.knosys.2013.09.005