Feature Nonlinear Transformation Non-Negative Matrix Factorization with Kullback-Leibler Divergence
Authors:
Highlights:
• A new non-negative matrix factorization model is proposed.
• A new non-negative matrix factorization method, called Feature Nonlinear Transformation Non-Negative Matrix Factorization with Kullback-Leibler Divergence (FNTNMF-KLD), is proposed.
• New iterative update rules for the basis matrix and the feature matrix are rigorously derived (see the sketch of standard KL-divergence NMF updates after this list).
• A proof of algorithm convergence is provided.
• Higher accuracies are achieved in object recognition and clustering.
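The highlights refer to multiplicative update rules for the basis and feature matrices under a Kullback-Leibler divergence objective. The paper's FNTNMF-KLD updates are not reproduced here; as a point of reference only, the sketch below shows the classical KL-divergence NMF multiplicative updates (Lee & Seung), which methods of this kind typically build on. The function name `nmf_kl` and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Classical multiplicative-update NMF minimizing the generalized
    Kullback-Leibler divergence D(V || W H) (Lee & Seung, 2001).

    NOTE: this is only the standard KL-divergence baseline; the paper's
    feature nonlinear transformation and its modified update rules are
    not reproduced here.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps   # basis matrix
    H = rng.random((rank, m)) + eps   # feature (coefficient) matrix
    ones = np.ones((n, m))
    for _ in range(n_iter):
        WH = W @ H + eps
        # H <- H * (W^T (V / WH)) / (W^T 1)
        H *= (W.T @ (V / WH)) / (W.T @ ones + eps)
        WH = W @ H + eps
        # W <- W * ((V / WH) H^T) / (1 H^T)
        W *= ((V / WH) @ H.T) / (ones @ H.T + eps)
    return W, H

if __name__ == "__main__":
    V = np.abs(np.random.default_rng(1).random((50, 40)))
    W, H = nmf_kl(V, rank=5)
    WH = W @ H
    kl = np.sum(V * np.log((V + 1e-10) / (WH + 1e-10)) - V + WH)
    print(f"generalized KL divergence after fitting: {kl:.4f}")
```

The small epsilon terms guard against division by zero in the multiplicative updates; the paper's nonlinear transformation of the feature matrix would change these update rules in ways this sketch does not attempt to reproduce.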
Keywords: Non-negative matrix factorization, Nonlinear transformation, Feature extraction, Object recognition, Clustering, Kullback-Leibler divergence
Article history: Received 17 August 2020, Revised 6 August 2021, Accepted 14 July 2022, Available online 16 July 2022, Version of Record 30 July 2022.
Paper URL: https://doi.org/10.1016/j.patcog.2022.108906