Chunk incremental learning for cost-sensitive hinge loss support vector machine

Authors:

Highlights:

• We propose a chunk incremental learning algorithm for CSHL-SVM (i.e., CICSHL-SVM) that can update a trained model without re-training from scratch when a chunk of new samples is incorporated.

• Our method is efficient because it can update the trained model with either a single new sample or a chunk of new samples at a time (see the illustrative sketch after this list).

• The experimental results on a variety of datasets not only confirm the effectiveness of CSHL-SVM but also show that our method is more efficient than both the batch CSHL-SVM algorithm and single-sample incremental learning.
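To make the idea of chunk-wise incremental updates concrete, the minimal sketch below trains a cost-sensitive hinge-loss linear classifier on a stream of sample chunks using scikit-learn's SGDClassifier with partial_fit. The synthetic data, chunk size, and class costs are assumptions for illustration only; this is a generic stand-in for the concept, not the authors' CICSHL-SVM algorithm.

```python
# Hypothetical illustration of chunk-wise incremental learning with a
# cost-sensitive hinge loss; NOT the paper's CICSHL-SVM algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Synthetic, imbalanced binary data (stand-in for a real dataset).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)

# Hinge loss gives an SVM-style objective; class_weight makes errors on the
# rare class more costly (the "cost-sensitive" part, with assumed costs).
clf = SGDClassifier(loss="hinge", class_weight={0: 1.0, 1: 5.0}, random_state=0)

# Present the data as a stream of chunks; each call updates the existing
# model instead of re-training from scratch on all samples seen so far.
classes = np.unique(y)
for X_chunk, y_chunk in zip(np.array_split(X, 10), np.array_split(y, 10)):
    clf.partial_fit(X_chunk, y_chunk, classes=classes)

print("Training accuracy after the last chunk:", clf.score(X, y))
```

Unlike this stochastic stand-in, the paper's method updates an already trained CSHL-SVM model directly when a chunk of new samples arrives, which is the source of the reported efficiency gain over batch re-training.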

Keywords: Cost-sensitive learning, Chunk incremental learning, Hinge loss, Support vector machines

Article history: Received 2 September 2017, Revised 11 March 2018, Accepted 21 May 2018, Available online 29 May 2018, Version of Record 4 June 2018.

Article link: https://doi.org/10.1016/j.patcog.2018.05.023