An accuracy-maximization learning framework for supervised and semi-supervised imbalanced data
Abstract:
Although balanced error rate (BER) minimization learning frameworks have been developed for randomized learning of feedforward neural networks on imbalanced datasets, it remains unclear whether such frameworks can be effectively extended to the semi-supervised setting. This paper proposes, for the first time, the concept of accuracy maximization for randomized learning methods on imbalanced datasets, and theoretically proves that it is equivalent to minimizing a generalized BER for the selected neural networks. In particular, accuracy maximization extends naturally to semi-supervised scenarios, since its semi-supervised version is proved to be linearly dependent on the original. Based on the proposed concept, we develop an accuracy-maximization learning framework and, taking the Extreme Learning Machine (ELM) as a representative randomized learning method for feedforward neural networks, derive a new accuracy-maximization extreme learning machine (AMELM) to handle challenging issues such as class imbalance and label scarcity. Notably, the proposed accuracy-maximization framework is not limited to ELM: it can be extended to other randomized learning methods, such as the Random Vector Functional Link network (RVFL) and the Schmidt Neural Network (SNN), for both supervised and semi-supervised imbalanced data. The efficacy of AMELM is evaluated on extensive benchmark datasets. Experimental results show that AMELM achieves satisfactory performance on fully or partially labeled imbalanced data, and obtains classification performance at least comparable to baseline methods while requiring fewer hyperparameters to tune, indicating its potential for practical applications.
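To make the BER notion in the abstract concrete: BER averages the per-class error rates, so every class counts equally regardless of its size, and balanced accuracy is simply 1 − BER (which is why maximizing one is equivalent to minimizing the other in the binary balanced case). The helper below is an illustrative sketch of the standard definition, not the paper's implementation.

```python
def balanced_error_rate(y_true, y_pred):
    """BER: the mean of per-class error rates.

    Unlike the plain error rate, BER weights every class equally,
    so it is not dominated by the majority class.
    """
    classes = sorted(set(y_true))
    per_class_err = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        per_class_err.append(sum(y_pred[i] != c for i in idx) / len(idx))
    return sum(per_class_err) / len(per_class_err)


# Imbalanced toy data: 8 majority samples (class 0), 2 minority (class 1);
# the classifier gets one of the two minority samples wrong.
y_true = [0] * 8 + [1] * 2
y_pred = [0] * 8 + [0, 1]

ber = balanced_error_rate(y_true, y_pred)  # (0/8 + 1/2) / 2 = 0.25
plain_err = sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)  # 1/10 = 0.10
```

The plain error rate (0.10) looks good only because the majority class dominates; BER (0.25) exposes that half the minority class is misclassified, which is the failure mode the paper's framework targets.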
Keywords: BER minimization, Imbalance learning, Semi-supervised learning, Extreme learning machine
Article history: Received 25 May 2022; Revised 1 August 2022; Accepted 11 August 2022; Available online 17 August 2022; Version of Record 5 September 2022.
DOI: https://doi.org/10.1016/j.knosys.2022.109678