Learning Imbalanced Classifiers Locally and Globally with One-Side Probability Machine
Authors: Kaizhu Huang, Rui Zhang, Xu-Cheng Yin
Abstract
We consider the imbalanced learning problem, where the data associated with one class are far fewer than those associated with the other class. Current imbalanced learning methods often handle this problem by adapting certain intermediate parameters so as to impose a bias toward the minority data. However, most of these methods lack rigor and must tune those factors by trial and error. Recently, a model called the Biased Minimax Probability Machine (BMPM) offered a rigorous and systematic treatment and has demonstrated very promising performance on imbalanced learning. Despite its success, BMPM relies exclusively on global information, namely, the first-order and second-order statistics of the data; however, such information can be unreliable, especially for the minority data. In this paper, we propose a new model called the One-Side Probability Machine (OSPM). Unlike previous approaches, OSPM provides a rigorous treatment of biased classification tasks. Importantly, the proposed OSPM exploits the reliable global information from one side only, i.e., the majority class, while engaging robust local learning on the other side, i.e., the minority class. To the best of our knowledge, OSPM is the first model capable of learning data both locally and globally in this setting. The proposed model also establishes close connections with several well-known models such as BMPM, the Support Vector Machine, and the Maxi-Min Margin Machine. One appealing feature is that the optimization problem involved in the novel OSPM model can be cast as a convex second-order cone programming problem with the global optimum guaranteed. A series of experimental results on three data sets demonstrates the advantages of our proposed methods over four competitive approaches.
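The abstract does not spell out the formulation, but the core idea, a distribution-level (global) constraint on the majority class combined with point-level (local) constraints on the minority class, can be illustrated as a small second-order cone program. Below is a minimal sketch in Python with cvxpy; it assumes a Chebyshev-style one-side bound on the majority class (as in minimax probability machines) and SVM-like hinge constraints on minority points. The variable names, synthetic data, and the exact formulation are illustrative only, not the authors' OSPM model.

```python
import cvxpy as cp
import numpy as np

# Illustrative data: a large majority class and a small minority class.
rng = np.random.default_rng(0)
X_maj = rng.normal(loc=0.0, scale=1.0, size=(200, 2))   # majority class
X_min = rng.normal(loc=3.0, scale=1.0, size=(10, 2))    # minority class

# Global (first- and second-order) statistics of the majority class.
mu = X_maj.mean(axis=0)
Sigma = np.cov(X_maj, rowvar=False)
S = np.linalg.cholesky(Sigma + 1e-6 * np.eye(2))        # so S @ S.T = Sigma

# Target worst-case accuracy beta on the majority side; by the Chebyshev
# bound, Pr(w^T y <= b) >= beta holds for any distribution with mean mu and
# covariance Sigma whenever b - w^T mu >= kappa * sqrt(w^T Sigma w).
beta = 0.9
kappa = np.sqrt(beta / (1.0 - beta))

w = cp.Variable(2)
b = cp.Variable()
xi = cp.Variable(X_min.shape[0], nonneg=True)           # minority slacks

constraints = [
    # Global side: one-side probabilistic constraint on the majority class.
    b - w @ mu >= kappa * cp.norm(S.T @ w, 2),
    # Local side: each minority point classified with margin, softened by slack.
    X_min @ w - b >= 1 - xi,
]
prob = cp.Problem(cp.Minimize(cp.sum(xi)), constraints)
prob.solve()
print("w =", w.value, " b =", b.value, " total slack =", float(xi.value.sum()))
```

The norm constraint makes this a convex SOCP for a fixed kappa, so a standard conic solver returns the global optimum, which mirrors the guarantee claimed for OSPM in the abstract.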
Keywords: Learning locally and globally, Imbalanced learning, Classification
Paper link: https://doi.org/10.1007/s11063-014-9370-9