Global convergence of Negative Correlation Extreme Learning Machine

Author: Carlos Perales-González

Abstract

Ensemble approaches introduced in the Extreme Learning Machine (ELM) literature mainly come from methods that rely on data sampling procedures, under the assumption that the training data are heterogeneous enough to set up diverse base learners. To overcome this assumption, an ELM ensemble method based on the Negative Correlation Learning (NCL) framework, called the Negative Correlation Extreme Learning Machine (NCELM), was proposed. This model works in two stages: (i) different ELMs are generated as base learners with random weights in the hidden layer, and (ii) an NCL penalty term carrying the information of the ensemble prediction is introduced into the minimization problem of each ELM, updating the base learners; this second stage is iterated until the ensemble converges. Although this NCL ensemble method was validated by an experimental study on multiple benchmark datasets, no conditions guaranteeing this convergence were given. This paper presents sufficient conditions that guarantee the global convergence of NCELM: the per-iteration update of the ensemble is shown to be a contraction mapping, and global convergence then follows from the Banach fixed-point theorem.
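The iterative scheme described in the abstract lends itself to a short sketch. The following is a minimal, hypothetical NumPy illustration, not the paper's exact formulation: each base learner is refit by a ridge solve in which an NCL-style diversity term λ‖Hβ − f̄‖² (coupling the learner to the current ensemble mean f̄) is subtracted from the objective, and the loop stops when successive coefficient iterates stop moving, in the spirit of a Banach fixed-point iteration (the theorem guarantees a unique fixed point whenever the update map T satisfies ‖T(u) − T(v)‖ ≤ q‖u − v‖ for some q < 1). The hyperparameters C and λ, the tanh activation, and all function names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_elm(X, n_hidden, rng):
    """Random-feature hidden layer of one ELM base learner (weights never trained)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return lambda X_: np.tanh(X_ @ W + b)

def ncelm_fit(X, Y, n_learners=5, n_hidden=50, C=10.0, lam=0.1,
              max_iter=100, tol=1e-6, rng=rng):
    """Hypothetical sketch of an NCELM-style scheme:
    (i) independent ELMs with random hidden weights,
    (ii) repeated ridge solves with an NCL-style penalty that couples each
        learner to the ensemble mean, iterated until a fixed point is reached."""
    hidden = [make_elm(X, n_hidden, rng) for _ in range(n_learners)]
    H = [h(X) for h in hidden]
    # Stage (i): plain regularized ELM solutions as the starting point.
    betas = [np.linalg.solve(Hs.T @ Hs + np.eye(n_hidden) / C, Hs.T @ Y)
             for Hs in H]
    for _ in range(max_iter):
        # Current ensemble prediction (mean of base-learner outputs).
        F_bar = np.mean([Hs @ b for Hs, b in zip(H, betas)], axis=0)
        new_betas = []
        for Hs in H:
            # Minimize ||Hb - Y||^2 + ||b||^2/C - lam*||Hb - F_bar||^2;
            # lam < 1 keeps the system positive definite (assumption).
            A = (1.0 - lam) * (Hs.T @ Hs) + np.eye(n_hidden) / C
            new_betas.append(np.linalg.solve(A, Hs.T @ (Y - lam * F_bar)))
        shift = max(np.linalg.norm(nb - b) for nb, b in zip(new_betas, betas))
        betas = new_betas
        if shift < tol:  # successive iterates converge, Banach-style stopping
            break
    return hidden, betas

def ncelm_predict(hidden, betas, X):
    """Ensemble output: average of the base learners' predictions."""
    return np.mean([h(X) @ b for h, b in zip(hidden, betas)], axis=0)
```

In this sketch the choice λ < 1 plays the role of the kind of sufficient condition the paper studies: it keeps each per-learner system well posed and bounds how strongly one iteration's output can amplify the previous one, which is what a contraction-mapping argument needs.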

Keywords: Ensemble, Negative correlation learning, Extreme learning machine, Fixed-point, Banach, Contraction mapping

DOI: https://doi.org/10.1007/s11063-021-10492-z