Multi-task analysis discriminative dictionary learning for one-class learning
Abstract
One-class classification is a generalization of supervised learning that trains on examples from a single class. It has attracted growing attention in machine learning and data mining. In this paper, we propose a novel approach called multi-task dictionary learning for one-class learning (MTD-OC), which incorporates analysis discriminative dictionary learning into one-class learning. Analysis discriminative dictionary learning ensures that the dictionaries corresponding to different tasks are as independent and discriminative as possible. It simultaneously minimizes a norm constraint, an analysis incoherence term, and a sparse-code extraction term, which together promote analysis incoherence and improve coding efficiency and classification accuracy. The one-class classifier on the target task is then constructed by transferring knowledge from multiple source tasks. Here, one-class classification improves the performance of the analysis discriminative dictionary, while the analysis discriminative dictionary in turn improves the performance of the one-class classification term. In MTD-OC, a single optimization function is formulated to handle both the one-class classifier and analysis discriminative dictionary learning from one class of examples. We then propose an iterative framework to solve this optimization function and obtain the predictive classifier for the target class. Extensive experiments show that MTD-OC improves the accuracy of the one-class classifier by learning an analysis discriminative dictionary from each task to construct a transfer classifier.
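To make the ingredients named in the abstract concrete, the following is a minimal sketch of the general idea: per-task analysis dictionaries (codes obtained as z = P x), a cross-task incoherence penalty, and a simple one-class objective (distance to a class center in code space), alternately optimized. All function names, the specific penalty forms, the center-based one-class term, and the gradient-descent solver are illustrative assumptions, not the paper's actual MTD-OC formulation or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def incoherence_grad(P_list, t):
    """Gradient w.r.t. P_t of sum_{s != t} ||P_t P_s^T||_F^2.
    (Assumption: a generic cross-task incoherence term.)"""
    g = np.zeros_like(P_list[t])
    for s, Ps in enumerate(P_list):
        if s != t:
            g += 2.0 * (P_list[t] @ Ps.T) @ Ps
    return g

def mtd_oc_sketch(X_tasks, k=8, lam=0.1, mu=0.1, iters=50, lr=1e-2):
    """Alternating-minimization sketch: for each task t, learn an analysis
    dictionary P_t (codes z = P_t x) and a one-class center c_t in code
    space. Hypothetical simplification, not the published MTD-OC solver."""
    P_list = [rng.standard_normal((k, X.shape[0])) * 0.1 for X in X_tasks]
    c_list = [np.zeros(k) for _ in X_tasks]
    for _ in range(iters):
        for t, X in enumerate(X_tasks):
            P, c = P_list[t], c_list[t]
            Z = P @ X                       # analysis codes, one column per sample
            R = Z - c[:, None]              # one-class residual to the center
            # Gradient of ||P X - c 1^T||_F^2 + lam ||P||_F^2 + mu * incoherence
            gP = 2.0 * R @ X.T + 2.0 * lam * P + mu * incoherence_grad(P_list, t)
            P_list[t] = P - lr * gP         # dictionary update (gradient step)
            c_list[t] = Z.mean(axis=1)      # closed-form center update
    return P_list, c_list

# Toy usage: two related tasks, positive-class samples only (5 features each).
X_tasks = [rng.standard_normal((5, 30)), rng.standard_normal((5, 25))]
P_list, c_list = mtd_oc_sketch(X_tasks)
# Anomaly score for task 0: distance of each sample's code to the learned center.
score = np.linalg.norm(P_list[0] @ X_tasks[0] - c_list[0][:, None], axis=0)
```

Samples with a large `score` would be flagged as outliers; in the actual method the one-class term, the sparse-code extraction, and the transfer from source to target tasks are coupled inside one optimization function rather than the simple alternation shown here.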
Keywords: Multi-task learning, One-class classifier, Dictionary learning
Article history: Received 27 July 2020, Revised 20 April 2021, Accepted 2 June 2021, Available online 5 June 2021, Version of Record 8 June 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107195