Statistical performance of convex low-rank and sparse tensor recovery
Abstract
Low-rank or sparse tensor recovery finds many applications in computer vision and machine learning. The recently proposed regularized multilinear regression and selection (Remurs) model assumes the true tensor to be simultaneously low-Tucker-rank and sparse, and has been successfully applied in fMRI analysis. However, a statistical analysis of Remurs-like models is still lacking. To address this gap, a minimization problem based on a newly defined tensor nuclear-ℓ1-norm is proposed to recover a simultaneously low-Tucker-rank and sparse tensor from its degraded observations. An M-ADMM-based algorithm is then developed to solve the problem efficiently. Furthermore, the statistical performance is analyzed by establishing a deterministic upper bound on the estimation error under general noise. In addition, under Gaussian noise, non-asymptotic upper bounds are given for two specific settings, namely noisy tensor decomposition and random Gaussian design. Experiments on synthetic datasets demonstrate that the proposed theorems precisely predict the scaling behavior of the estimation error.
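For intuition, the following is a minimal sketch of a nuclear-ℓ1-regularized recovery objective of the kind described above, assuming the tensor nuclear norm is taken as the usual sum of nuclear norms of the mode-k unfoldings (the standard convex surrogate for Tucker rank); the paper's exact definition, weights, and constraints may differ:

\min_{\mathcal{X}} \; \frac{1}{2}\,\bigl\| \mathbf{y} - \mathfrak{A}(\mathcal{X}) \bigr\|_2^2 \;+\; \lambda \Bigl( \alpha \sum_{k=1}^{K} \bigl\| \mathbf{X}_{(k)} \bigr\|_* \;+\; (1-\alpha)\, \bigl\| \mathrm{vec}(\mathcal{X}) \bigr\|_1 \Bigr),

where \mathfrak{A} denotes the linear measurement operator (random Gaussian in the compressive-sensing setting), \mathbf{X}_{(k)} is the mode-k unfolding of \mathcal{X}, and \lambda, \alpha are hypothetical regularization weights introduced here for illustration. In the noisy-tensor-decomposition setting, \mathfrak{A} reduces to the identity and the data term becomes \frac{1}{2}\|\mathcal{Y} - \mathcal{X}\|_F^2.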
Keywords: Tensor recovery, Statistical performance, Tucker rank, Tensor de-noising, Tensor compressive sensing
Article history: Received 1 March 2018, Revised 20 September 2018, Accepted 19 March 2019, Available online 20 March 2019, Version of Record 30 April 2019.
Paper link: https://doi.org/10.1016/j.patcog.2019.03.014